
Future is Now - AI brings benefits to business world

By Tim Newton

Fabricio d’Almeida understands the fear.

A clinical assistant professor in finance, d’Almeida sees both the promise and the unease with which the public regards artificial intelligence, or AI. Machine learning can bring great gains in productivity, reduce human error and automate repetitive tasks. On the other hand, it carries security risks from hacking, raises ethical concerns about privacy, and lacks human creativity and empathy.

“It’s definitely here to stay, and we will need to learn as citizens how to regulate it and use it to society’s benefit.” - Fabricio d’Almeida

“There is pessimism and optimism around AI, similar to what humans experienced when we first were able to split the atom,” d’Almeida says. “But it’s definitely here to stay, and we will need to learn as citizens how to regulate it and use it to society’s benefit.”

The Mitchell E. Daniels, Jr. School of Business is keeping watch over AI through two different lenses. First, how will students and faculty use it in everyday class settings? Second, and perhaps more importantly, how can instructors teach students the usage of AI to make them more employable and valuable to their future organizations?

Cheating or efficiency?

The first question faculty and students have to consider is where the line should be drawn in the use of software like ChatGPT, a large language model (LLM) that uses natural language processing to create humanlike dialogue. The fear among many is that, much as students once copied from encyclopedias or cut and pasted from the internet, they will now use such software to write their homework assignments.

Cara Putman, clinical assistant professor in law, communication and ethics, says the key to finding the right solution is being proactive.

“Often we wait for technology to run ahead and then we try to catch up,” she says. “It’s a very reactive approach. We can’t wait until after the technology has been developed to decide what the ethical solution should be.”

“You can ask AI to write you a five-paragraph paper on a particular topic, but that’s not the assignment” - Kelly Blanchard

Several faculty members have had those discussions with students and have come to an understanding. Brian Chupp, clinical assistant professor in organizational behavior and human resources, says he encourages his students to use tools such as ChatGPT.

“I include in my syllabus that I’ll allow them to use it, but only 10 to 15 percent of their paper can come from it, and they have to cite it just like any other source,” Chupp says. “I regularly use ChatGPT to check the instructions for my assignments and sometimes change 20 percent of that activity based on the feedback I get, so it would be hypocritical of me to tell my students not to utilize it.”

Kelly Blanchard, clinical associate professor of economics and associate dean for undergraduate programs, says ChatGPT can also be a great brainstorming tool to come up with research project ideas. But students have to take the next step.

“You can ask AI to write you a five-paragraph paper on a particular topic, but that’s not the assignment,” Blanchard says. “The task is then for the student to improve on that. Find the evidence and documentation that supports and improves what has been generated.”

 

Rules & Regulations

Purdue provides guidance for instructors on the use of AI in teaching and learning.

The site includes items on the use of AI detection, copyright issues, FERPA and other privacy responsibilities, and tips on what to include in course syllabi.

A course syllabus should contain specific guidance on:

  • allowable use of AI tools by students for assessed work in the course,
  • appropriate use of AI tools by students during the learning process for non-assessed work,
  • how the instructional team may attempt to detect the use of AI in both situations above, and
  • potential consequences faced by students who violate AI-use policies.

The guidelines are updated each semester as technologies continue to evolve and expand. Soon, the university will develop similar guidance specifically for students.

A must-have skill

Regardless of where the lines are drawn from class to class, virtually all instructors agree on one fact: AI is here to stay, and students will need to master its use to be competitive in the marketplace.

“Employers are going to expect recent grads to be familiar with it,” says Matthew Lanham, clinical assistant professor in quantitative methods. “Let’s try to take advantage of these tools to make our life a little more efficient and easier.”

“Now accountants can spend more time working on more interesting areas and projects.” - Troy Janes

He points to the recent Data 4 Good case competition, a national contest that challenged students from Purdue and other universities to fuse data, AI technology, innovative process models and research-based methods to make good decisions. Students in the challenge used open AI tools to take physicians’ transcripts and fill out tedious medical forms, freeing the doctors to spend more time with patients.

Troy Janes, clinical professor in accounting, is teaching students in the Master of Science in Accounting program to use an AI-based auditing software package. It uses machine learning to help auditors conduct risk-based audits rather than the old random-sampling approach.

Janes says AI has provided another benefit to the accounting profession, which is suffering from a national shortage of students.

“By automating some of the more mundane tasks we have to perform, it can help make accounting a little more ‘sexy,’” he says. “I know that early in my career I was a pro at the copy machine. Now accountants can spend more time working on more interesting areas and projects.”

Human touch still needed

Jeffrey Hu, Accenture Professor of Information Technology, is a world-renowned expert on AI and business analytics. Students in his Master of Science in Business Analytics and Information Management classes have learned to use ChatGPT to write Python code to solve business problems.

“Those of us who teach at the intersection of business and technology embrace such challenges” - Mohammad Rahman

Automating the process has saved students a great deal of time and provided clear documentation. But Hu says AI users can’t rely entirely on the automation.

“First, the technology isn’t perfect, so students may have to try several times to get the code to work correctly,” Hu says. “In addition, interpretation is not automatic and humans are still needed to translate the output of AI tools to business insights and decisions. The focus of my teaching is on which business problems can be solved by AI and how to apply the appropriate AI tools to solve those business problems.”

One upside of using AI tools is that students will have more time to do such deep thinking. Mohammad Rahman, Daniels School Chair, has focused much of his research on AI and decision making.

“We will have a productivity boost as a result of the use of the new technology. Of course, we will also have to skill up humans to be able to maximize their utility, but it’s no different than giving a plumber new tools or using machinery to plow fields instead of animals. Those of us who teach at the intersection of business and technology embrace such challenges,” Rahman says.

 

Do It Yourself

One Daniels School faculty member has developed his own AI tool for classroom instruction. John Burr, clinical assistant professor of strategic management, has teamed with alum Shrinivas Jandhyala (MSIA’02) to create CorpusKey, a set of large language model (LLM) tools that helps instructors develop course material and measure student understanding.

Burr, who uses the product in his Consulting Tools and Skills class and others, says it’s a win-win for both teacher and student.

“For the instructor, it allows you to drop in a curated set of varied documents and produce a product that speaks in one voice at the appropriate level of understanding for that set of students,” he says. “From the student perspective, they aren’t forced into buying expensive textbooks and they have a more focused and concentrated product from which to learn.”

On the front end, instructors load the aggregated course material into CorpusKey. They can then select which types of materials to develop, ranging from slide decks to test banks to live cases, allowing a more personalized experience for their specific set of students.

The back-end processing relies on a set of LLM tools tailored to specific tasks. These might range from custom tokenizers built for a particular knowledge domain to sets of fine-tuned agents working in combination toward specific objectives. Customizing and combining multiple LLMs, which use deep learning to understand how words and sentences function together, becomes more important when dealing with function- or domain-specific language.
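
For readers curious what one such back-end step could look like, here is a minimal sketch, not CorpusKey’s actual code, of sending curated course notes to a hosted LLM and asking it to draft quiz questions. The OpenAI client, the gpt-4o-mini model choice and the draft_quiz helper are illustrative assumptions.

    from openai import OpenAI

    client = OpenAI()  # assumes the OPENAI_API_KEY environment variable is set

    def draft_quiz(course_notes: str, num_questions: int = 5) -> str:
        """Ask a hosted LLM to turn curated course notes into draft quiz questions."""
        prompt = (
            f"Using only the course notes below, write {num_questions} "
            "multiple-choice questions with four options each and mark the "
            "correct answer.\n\nNotes:\n" + course_notes
        )
        response = client.chat.completions.create(
            model="gpt-4o-mini",  # illustrative model choice, not CorpusKey's
            messages=[{"role": "user", "content": prompt}],
        )
        return response.choices[0].message.content

    # The instructor would still review and edit the draft before using it in class.
    print(draft_quiz("Porter's five forces: supplier power, buyer power, threat of entry, substitutes, rivalry."))

In practice, the generated questions would go back to the instructor for review, consistent with Burr’s caution below that model output can sound convincing yet miss the mark.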

In addition to the benefits of more personalized content, Burr says CorpusKey, and AI in general, can also give students more complete feedback. Practitioners in AI and education are still exploring how to leverage these tools to benefit students and understand how instructor processes will change as a result.

“I can use the software to grade both long- and short-form responses, giving me the opportunity to provide assessments faster and more frequently,” he says. He cautions, though, that current models can give unexpected results that sound convincing yet miss the mark. For example, instructors need to make sure the feedback stays within the boundaries of the original questions students were asked.

“Our goal with CorpusKey is to provide faculty with the tools necessary to focus on high-value tasks. We believe we can offer more precise learning materials and assessments, and we’re trying to reshape the future of education,” he says.

 

Better teaching

As the use of AI tools grows, so too do some unintended benefits. Professor d’Almeida says ChatGPT can provide a transcript of his classes for student use, and he finds it useful as well.

“It’s our job to figure out how to adopt it and make sure our students are aware of its power and potential.” - Matthew Lanham

“The software will make a summary and generate multiple choice questions about the lecture. I can then check if what I want to get across as the main points for that day is happening. I don’t have to go back and watch an entire lecture to see if I was hitting the mark,” says d’Almeida, who also uses ChatGPT to help improve his e-mails and letters.

Technology rarely, if ever, runs in reverse. It’s clear that AI is here to stay and that faculty, students, and practitioners must learn to harness its power and potential.

“All my colleagues that I’ve talked to about this stuff have embraced it,” Professor Lanham says. “We’re not scared of it, but you have to use guardrails to evaluate what the students really know.

“It’s our job to figure out how to adopt it and make sure our students are aware of its power and potential.”
