Nassau Community College sophomore Nathan Cheong said he's found ChatGPT to be valuable when researching. "Self control is necessary. There's a natural temptation to let it do everything for you," he said. Credit: Newsday/Alejandra Villa Loarca


Stony Brook University graduate student Brandon Banarsi recalls that around this time last year, students spoke about artificial-intelligence tools such as ChatGPT in low, secretive tones.

"It was something underground," Banarsi said about ChatGPT, the app introduced in November 2022 that can produce human-sounding essays, solve math problems or craft computer code from a single written request.

 "Academia was trying to control it and put boundaries around it, thinking it was detrimental to students," he added.

Much has changed since. The fear that struck many college professors who viewed ChatGPT as a digital cheating device has calmed considerably, Long Island educators say. Local college officials say they're focusing on integrating such artificial intelligence, or AI, into learning experiences, not banning it, as some schools around the country have done.

WHAT TO KNOW

  • The fear that many college professors felt toward artificial-intelligence tools such as ChatGPT — that it would be a digital cheating device — has calmed considerably, say Long Island educators.
  • These days, local college officials say they're focusing on integrating such artificial intelligence into learning experiences, not banning it, as some schools around the country have done.
  • Some professors are calling AI the "robot emancipator," in that it will help educators overcome the obstacles they've faced for decades.

"Now there is so much awareness about AI It's talked about all the time," said Banarsi, who is studying electrical engineering and who participated in a recent Stony Brook forum on AI.

The emergence of AI has created a bevy of headlines and debate as to whether, in the long run, these programs will be good for humanity. 

College officials are still figuring out the role of AI, discovering what works, what doesn't and what to worry about. Many educators say banning it would be futile and that, considering the growing importance of AI in industry and culture, students need to know how to use it in their lives and jobs.

Stony Brook University issued a set of guidelines for teachers at the end of the spring semester on AI writing bots. The guidelines allow instructors to decide whether their students can use them, and urge teachers to clarify for students the proper and improper uses.

Molloy University in Rockville Centre is offering professional development for professors on AI.

"Initially there was a lot of fear and concern for what it would mean for education," Molloy Professor Mubina Schroeder said. "I don't see that panic now."

Molloy University Professor Mubina Schroeder teaches a class of future educators on the use of ChatGPT in December. Credit: Jeff Bachner

Suffolk County Community College updated its policies in the fall semester to prohibit students from using material generated by AI tools without authorization from an instructor, officials said. The new policy is included in the syllabus received by every student for every course. The college also held a seminar on "teaching in an era of artificial intelligence," said Irene Rios, interim vice president for academic affairs.

Hofstra University in Hempstead dedicated a seven-page cover story to AI in its recent campus magazine. The story touts advancements, from a new engineering facility focused on robotics and machine learning, to nursing labs where students work on mannequins that can bleed, have babies and even die.

Molloy's Schroeder said she believes AI will revolutionize teaching and learning.

"Every person might have an AI assistant, a personal tutor, a personal career coach," she said. 

AI as a teaching tool

Stony Brook writing and rhetoric Professor Roger Thompson not only allows his students to use AI, he assigns it.

"I actively make it part of my class, instead of seeing it as an obstacle," Thompson said. "I hope I'm not wrong."

Thompson said he's placed guardrails around using AI tools such as ChatGPT. For one recent paper, he instructed students to have AI create their first draft. He sees that draft as a starting point for generating ideas, not a finished product.

Students then broke into small groups where they workshopped their ideas as they wrote a second and third draft, he said. By taking these steps, Thompson said, he sees students putting in the work.

"I think it's opening doors to where they might not normally go," he said.

Molloy students at work last month. Some 56% of college students nationally say they have used AI on assignments or exams, according to one fall survey. Credit: Jeff Bachner

Some professors say they're using the apps to help students study, draft outlines for projects and break complicated concepts down into understandable terms.

But by no means is every college class immersed in AI. Adoption is spotty and tends toward writing-intensive courses such as English and history, educators said.

Some 56% of college students nationally say they have used AI on assignments or exams, according to a survey in the fall by BestColleges, a website that ranks colleges. A similar survey by the group in March found only 22% said they used it.

ChatGPT, the most widely used and controversial chatbot, has gone viral, boasting some 180 million users and spurring creation of other bots such as Bard, Bing Chat and YouChat. 

Professors say they're still concerned about students cheating with AI, prompting some to rethink how they test and assess students. Schroeder, a professor in Molloy's School of Education and Human Services, said she has brought back the old "blue books" for more in-class writing assignments. She's also assigned writing topics that call on students to express more personal observations and experiences, reducing the chances of students using AI.

"When you assign that, you can better assess what they're doing," she said.

Schroeder said she sees great promise in AI's ability to help struggling students. She pointed to Khan Academy, an educational nonprofit that produces free video lessons and practice exercises on math, sciences, history and literature.

She's also using it to make herself a better teacher, she said.

"If there's a student who's disengaged, I'll ask the AI to role play the role of the student and come up with assignments that are interesting," she said.

She added, "Sometimes, if an assignment is confusing, I'll ask AI, 'Please think of yourself as a student in this class and tell me what you find confusing about this assignment.' And it can help me rethink an assignment." 

'Self control is necessary' with AI

Banarsi, the Stony Brook graduate student, said he recalled little, if any, discussion about ChatGPT in his classes in the spring semester. This past semester, he asked an instructor for help with creating some computer code and the teacher gave him permission to ask ChatGPT.

"ChatGPT created an outline and a framework for the code, and gave me some skeleton basics to work with," Banarsi, 23, said. "I worked off that to create my own code."

Nathan Cheong, a sophomore at Nassau Community College, said he doesn't use ChatGPT for schoolwork — he likes his writing to be his own — but he's found it valuable in personal research. For example, Cheong hopes to pursue a career in architecture and was interested in finding alternatives to concrete.

NCC student Nathan Cheong said that with ChatGPT, "I was able to understand something I didn't understand before." Credit: Newsday/Alejandra Villa Loarca

He could have typed the question into Google, but he would have had to sift through a multitude of websites for the information, he said. He could have gone to the library to do research, but that would have taken even longer.

The chatbot spit out an answer right away. It saved time and it helped provide new ideas, he said. 

"I was able to understand something I didn't understand before," Cheong said. At the same time, he added, "Self control is necessary. There's a natural temptation to let it do everything for you. ... I didn't allow it to take control."

Students are learning lessons about ChatGPT, such as the bot's propensity to make things up or, as some in the tech world call it, "hallucinate."

"ChatGPT tries to come up with an answer even if it doesn't know. It's not always accurate," said Nistha Boghra, a junior journalism major at Stony Brook.

In her journalism class, Boghra said, the bot helped her pick topics for an assignment to write a story on solutions to some societal problems.

"I don't think I want to use it more than I am," she said. "I want to continue to challenge myself." 

Moving to the 'AI age'

Not all college professors are convinced their students should use AI.

Lizzie McCormick, an English professor at Suffolk County Community College, said she does not allow her students to use it in writing assignments.

"I don't think it's going to help them," McCormick said. "I've re-embraced the old-fashioned basics."

Nonetheless, she still has to deal with it. 

McCormick said the emergence of writing bots such as ChatGPT has encouraged instructors to create unique assignments that focus on lesser-known authors and uncommon questions. For example, she might ask students to craft an essay on a novel from the perspective of a minor character.

Educators pick up on a student's writing style, she said, and papers produced by AI tend to have a flat, generic style of composition. She said she's graded hundreds of essays since the chatbots emerged last year, and only a handful have shown signs of AI plagiarism.

"I'll tell the student that an AI detector said there was a 60% chance that it is entirely AI And I'll ask them, 'What do you think about that? Do you have a draft or an outline?' "

A Molloy student tries out a virtual reality headset last month in a class on the use of AI. Credit: Jeff Bachner

Professors say they've learned their own lessons on the limitations of AI. Several said they've worked with apps designed to identify AI-generated plagiarism, but found the apps don't work very well. The problem is that an essay generated by, say, ChatGPT combines information from the web into original prose, making the plagiarism difficult to spot.

The speed with which AI is moving into higher education reflects the pace at which it is entering everyday life, from personalized movie suggestions on Netflix to depositing a check by scanning it with a phone, educators say.

Thompson, the Stony Brook professor, said he sees society moving from "the digital age to the AI age."

The potential for AI to advance education is limitless, Schroeder said.

"You still need the human connection," she said. But, she added, "Some people are calling it the 'robot emancipator,' that it will finally help education overcome the obstacles we've faced for decades."
