Dr. Roger J. Huston is the founder and principal consultant of Faith Works Consulting—a management consulting and leadership coaching company. Over the past two decades, he has served in educational leadership positions, including as the Dean of the School of Business and Leadership at Kentucky Christian University and as the Director of a nonprofit educational organization. Roger enjoys working with foundations, human service nonprofits, social enterprises, and higher education institutions on administrative and strategic issues. He has taught undergraduate and graduate courses at several colleges and universities in business, computer science, economics, entrepreneurship, finance, leadership, management, marketing, political science, and public policy. Roger graduated Summa Cum Laude with a B.A. with Honors in Religion from Simpson College. He then earned a Master of Arts in Religion from Yale University. Roger also holds a Master of Public Administration from Iowa State University and a Ph.D. in Public Policy from the University of Delaware.
The release of ChatGPT by OpenAI last fall has created a frenzy in higher education. Faculty are concerned that students will submit AI-generated text for writing assignments, passing off work that is not their own. Although tools are being developed to detect plagiarism in AI-generated text, fears persist that the higher education landscape of learning could be altered forever. Those fears may be accurate, but perhaps not for the reasons currently projected in the media. AI is only as good as its dataset. Thus far, when prompted, a computer essentially recombines past results into "new" formulations. So if you ask a machine learning application like an AI chatbot to take a test or produce an essay on a known topic, it will naturally accomplish the task with varying degrees of success, albeit in a somewhat robotic, simple, or broad manner. At this point, AI's value is less about the performance and more about the data, but that is about to change, and higher education needs to be ready.
In an interview on Yahoo Finance, Cathie Wood, founder of ARK Invest, which manages the ARK Innovation ETF, astutely explained that the realized value in AI, from a monetization perspective, could be how well organizations use their data to expand innovation opportunities and improve their performance. She gave the example of Tesla and the immense amount of driving data it has generated over the last several years. Computers can use all that data to predict future probabilities and enhance real-time performance. For example, one small dataset might capture how long it takes, on average, to smoothly stop a mid-sized sedan traveling at 60 mph on the interstate in the rain. A computer can analyze that data across millions of recorded events and conclude that it should allot 5.5 seconds to come to a complete stop. While the stopping of the vehicle is itself impressive, it is the data that made it possible. Without that data, a vehicle's computer might try to stop in 8 seconds, which would feel tediously slow to the passengers, or in 3 seconds, a screeching halt that could cause tire slippage and vehicle instability, leading to an accident. Because Tesla has been accumulating this data for years, it will be well positioned to compete in creating autonomous vehicles.
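To make the reasoning concrete, here is a minimal, purely illustrative Python sketch of the kind of aggregation described above. The event values and the function are invented for the example; this is not Tesla's actual data or code, only the principle that a target behavior can be derived from many recorded events.

```python
# Toy illustration (not a real autonomous-driving system): estimate a comfortable
# stopping time from historical braking events recorded in the rain.
# The event values below are made up for the example.

recorded_stop_times = [5.2, 5.8, 5.4, 5.6, 5.7, 5.3]  # seconds, hypothetical events


def target_stop_time(events: list[float]) -> float:
    """Average past stopping times to pick a target for future stops."""
    return sum(events) / len(events)


target = target_stop_time(recorded_stop_times)
print(f"Allot roughly {target:.1f} seconds to come to a complete stop")
# With these made-up numbers the target lands at 5.5 seconds; with no data,
# the system would have to guess, risking an 8-second crawl or a 3-second skid.
```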
So, what does this advancement in machine learning mean for higher education? AI-generated text is only the tip of the iceberg. The economic and sociocultural impact of AI lies in its ability to access and analyze data and then create value for its users based on that data. As with Tesla, institutions of higher education now have an opportunity to roll out new technology that assists humans and produces stronger performance and better outcomes in a variety of areas. For example, admissions offices have already used chatbots to respond to prospective student questions. The release of ChatGPT, however, has shown that the time is now for institutions to strategize about how, not if, they will use AI in the future. Higher education must begin focusing on applications in all areas that could lead to greater efficiency in organizational processes, better outcomes, and increased worker productivity. Individualized personnel data and institutional datasets are where the future of AI in the higher education ecosystem lies.
Consider the areas of higher education and the vast amount of data they produce: academic affairs, advancement, athletics, business operations, communications, enrollment, finances, human resources, marketing, and student development. If you lead one of these areas, or serve on the Cabinet or Board of Trustees, you should be asking how your own institutional research and data can be put to work through AI to grow and improve these areas. For example: What are our prospective student personas, and how can we sharpen our targeted messages so that more of them enroll with us? Where are we missing opportunities to engage alumni and current students, and how can we facilitate networking and learning experiences between the two groups? How can we better forecast budget needs two, three, or even five years in advance based on existing models and revenue projections, and when should we intervene to change course?
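As a toy illustration of that last budget question, the sketch below fits a simple trend line to invented revenue figures and flags projections that fall below a hypothetical intervention threshold. Actual institutional forecasting models would use far richer data and methods; the point is only that existing data can surface when to change course.

```python
# Hypothetical sketch of the budget-forecasting question above: fit a simple
# linear trend to past net tuition revenue and project it a few years out.
# All figures and the $38M threshold are invented for illustration.

years = [2019, 2020, 2021, 2022, 2023]
revenue = [41.0, 40.2, 39.8, 39.1, 38.7]  # net tuition revenue, $ millions (made up)

# Ordinary least-squares slope and intercept, computed by hand to stay dependency-free.
n = len(years)
mean_x = sum(years) / n
mean_y = sum(revenue) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(years, revenue))
         / sum((x - mean_x) ** 2 for x in years))
intercept = mean_y - slope * mean_x

for future_year in (2025, 2026, 2028):  # two, three, and five years ahead
    projection = intercept + slope * future_year
    flag = "  <- consider intervening" if projection < 38.0 else ""
    print(f"{future_year}: projected ${projection:.1f}M{flag}")
```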
Likewise, as an individual, the power of machine learning and AI will undoubtedly make your tasks easier and much less time-consuming in the future. Say I am applying for a job, and human resources asks me about my past experiences and accomplishments and how they relate to the position at the institution. I do not want AI to generate nonspecific text based on datasets gathered across the internet. Instead, I want AI to read, analyze, and harness my individual data to craft answers specific to me and the job for which I am applying. I want AI to draw on my own writing files, with their style, content, strengths, weaknesses, and emotional intelligence, and to highlight my achievements in response to hiring committees' questions. By building on my own work and thoughts, AI could drastically cut down on time-consuming tasks and improve the clarity and precision of my communication.
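A minimal sketch of that idea follows, assuming a small folder of my own writing samples. The file names, texts, and the crude keyword-overlap ranking are invented for illustration; a real assistant would use embeddings and a language model, but the principle of grounding the draft in my own data rather than generic internet text is the same.

```python
# Hypothetical sketch: pull the most relevant passages from my own writing
# before asking a model to draft an answer to a hiring committee's question.

my_writing = {
    "strategic_plan.txt": "Led a campus-wide strategic planning process that grew enrollment",
    "teaching_statement.txt": "My teaching emphasizes experiential learning and mentorship",
    "grant_report.txt": "Managed a federal grant, delivering outcomes ahead of schedule",
}


def most_relevant(question: str, corpus: dict[str, str], top_n: int = 2) -> list[str]:
    """Rank my own documents by word overlap with the question (toy heuristic)."""
    q_words = set(question.lower().split())
    scored = sorted(corpus.items(),
                    key=lambda kv: len(q_words & set(kv[1].lower().split())),
                    reverse=True)
    return [text for _, text in scored[:top_n]]


question = "Describe a time you led a planning process that improved enrollment."
context = most_relevant(question, my_writing)
prompt = "Answer in my voice, using only this material:\n" + "\n".join(context)
print(prompt)  # this grounded prompt, not generic text, is what the model would receive
```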
While privacy and legal issues will be an ongoing concern as the use of AI expands, the winners of the next generation of AI in higher education will ultimately be those who actively maintain and cultivate their datasets to improve human performance and educational outcomes. The value of AI will be in decreasing human time and effort while increasing results through data and predictive analytics. As such, the third-party vendors that provide services in the areas mentioned above are likely to capitalize the most on AI financially, because they build and maintain datasets for numerous institutions. Higher education leaders, however, can begin strategically planning for this eventuality now, so they are ready to lean into the myriad ways AI can make their work easier, more efficient, and more effective. These hybrid functions between humans and AI can lead to exciting possibilities in higher education. Suppose AI can help bring additional students to your campus, find employees better suited to your institutional mission and values, build better relationships with alumni and the surrounding community, decrease workloads for faculty and staff, increase student engagement and experiential learning opportunities, and simplify the budgeting process. Should not the arrival of AI be celebrated rather than feared?