Texas Universities Deploy AI Tools to Review How Courses Discuss Race and Gender

A senior Texas A&M University System official testing a new artificial intelligence tool this fall asked it to find how many courses discuss feminism at one of its regional universities. Each time she asked in a slightly different way, she got a different number.
“Either the tool is learning from my previous queries,” Texas A&M system’s chief strategy officer Korry Castillo told colleagues in an email, “or we need to fine tune our requests to get the best results.”
It was Sept. 25, and Castillo was trying to deliver on a promise Chancellor Glenn Hegar and the Board of Regents had already made: to audit courses across all of the system’s 12 universities after conservative outrage over a gender-identity lesson at the flagship campus intensified earlier that month, leading to the professor’s firing and the university president’s resignation.
Texas A&M officials said the controversy stemmed from the course’s content not aligning with its description in the university’s course catalog and framed the audit as a way to ensure students knew what they were signing up for. As other public universities came under similar scrutiny and began preparing to comply with a new state law that gives governor-appointed regents more authority over curricula, they, too, announced audits.
Records obtained by The Texas Tribune offer a first look at how Texas universities are experimenting with AI to conduct those reviews.
At Texas A&M, internal emails show staff are using AI software to search syllabi and course descriptions for words that could raise concerns under new system policies restricting how faculty teach about race and gender.
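The records do not spell out how that software works, and system officials have not shared their search terms, so the following is only a minimal sketch of what keyword-based flagging of course materials can look like; the term list and matching logic are hypothetical.

```python
# Hypothetical sketch of keyword-based syllabus flagging.
# Texas A&M has not released its actual search terms or software,
# so the terms and logic below are illustrative only.
import re

FLAG_TERMS = ["feminism", "gender identity", "decolonizing"]  # placeholder terms

def flag_course(description: str, terms: list[str] = FLAG_TERMS) -> list[str]:
    """Return the terms that appear as whole words in a course description."""
    return [
        term for term in terms
        if re.search(rf"\b{re.escape(term)}\b", description, re.IGNORECASE)
    ]

syllabus = "This seminar surveys feminism and labor history in the 20th century."
print(flag_course(syllabus))  # ['feminism']
```

Matching of this kind flags a word wherever it appears, without regard to context, which is the limitation experts describe later in this story.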
At Texas State, memos show administrators are suggesting faculty use an AI writing assistant to revise course descriptions. They urged professors to drop words such as “challenging,” “dismantling” and “decolonizing” and to rename courses with titles like “Combating Racism in Healthcare” to something university officials consider more neutral, like “Race and Public Health in America.”
Read Texas State University’s guide for faculty on how to review their curriculum with AI
While school officials describe the efforts as an innovative approach that fosters transparency and accountability, AI experts say these systems do not actually analyze or understand course content, instead generating answers that sound right based on patterns in their training data.
That means small changes in how a question is phrased can lead to different results, they said, making the systems unreliable for deciding whether a class matches its official description. They warned that using AI this way could lead to courses being flagged over isolated words and further shift control of teaching away from faculty and toward administrators.
“I’m not convinced this is about serving students or cleaning up syllabi,” said Chris Gilliard, co-director of the Critical Internet Studies Institute. “This looks like a project to control education and remove it from professors and put it into the hands of administrators and legislatures.”
Setting up the tool
During a board of regents meeting last month, Texas A&M System leaders described the new processes they were developing to audit courses as a repeatable enforcement mechanism.
Vice Chancellor for Academic Affairs James Hallmark said the system would use “AI-assisted tools” to examine course data under “consistent, evidence-based criteria,” which would guide future board action on courses. Regent Sam Torn praised it as “real governance,” saying Texas A&M was “stepping up first, setting the model that others will follow.”
That same day, the board approved new rules requiring presidents to sign off on any course that could be seen as advocating for “race and gender ideology” and prohibiting professors from teaching material not on the approved syllabus for a course.
In a statement to the Tribune, Chris Bryan, the system’s vice chancellor for marketing and communications, said Texas A&M is using OpenAI services through an existing subscription to aid the system’s course audit and that the tool is still being tested as universities finish sharing their course data. He said “any decisions about appropriateness, alignment with degree programs, or student outcomes will be made by people, not software.”
In records obtained by the Tribune, Castillo, the system’s chief strategy officer, told colleagues to prepare for about 20 system employees to use the tool to make hundreds of queries each semester.
The records also show some of the concerns that arose from early tests of the tool.
When Castillo told colleagues about the varying results she obtained when searching for classes that discuss feminism, deputy chief information officer Mark Schultz cautioned that the tool came with “an inherent risk of inaccuracy.”
“Some of that can be mitigated with training,” he said, “but it probably can’t be fully eliminated.”
Schultz did not specify what kinds of inaccuracies he meant. When asked if the potential inaccuracies had been resolved, Bryan said, “We are testing baseline conversations with the AI tool to validate the accuracy, relevance and repeatability of the prompts.” He said this includes seeing how the tool responds to invalid or misleading prompts and having humans review the results.
Experts said the different answers Castillo received when she rephrased her question reflect how these systems operate. They explained that these kinds of AI tools generate their responses by predicting patterns and generating strings of text.
“These systems are fundamentally systems for repeatedly answering the question ‘what is the likely next word’ and that’s it,” said Emily Bender, a computational linguist at the University of Washington. “The sequence of words that comes out looks like the kind of thing you would expect in that context, but it is not based on reason or understanding or looking at information.”
Because of that, small changes to how a question is phrased can produce different results. Experts also said users can nudge the model toward the answer they want. Gilliard said that is because these systems are also prone to what developers call “sycophancy,” meaning they try to agree with or please the user.
“Very often, a thing that happens when people use this technology is if you chide or correct the machine, it will say, ‘Oh, I’m sorry’ or like ‘you’re right,’ so you can often goad these systems into getting the answer you desire,” he said.
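A small experiment illustrates the point. The sketch below sends the same counting question to a chat model phrased three different ways; the model name, question wordings and catalog excerpt are placeholders, since the records do not say which OpenAI model or prompts Texas A&M is using.

```python
# Illustrative only: shows why rephrasing a question can change the answer.
# The model, prompts and catalog excerpt are placeholders, not Texas A&M's actual setup.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

phrasings = [
    "How many of these courses discuss feminism?",
    "Count the courses below that cover feminism.",
    "Which of the following courses mention feminist theory? Give a total.",
]

catalog_excerpt = "..."  # course titles and descriptions would be pasted here

for prompt in phrasings:
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[{"role": "user", "content": f"{prompt}\n\n{catalog_excerpt}"}],
    )
    # The model predicts plausible text rather than querying a database,
    # so each rewording can produce a different count, as Castillo observed.
    print(prompt, "->", response.choices[0].message.content)
```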
T. Philip Nichols, a Baylor University professor who studies how technology influences teaching and learning in schools, said keyword searches also provide little insight into how a topic is actually taught. He called the tool “a blunt instrument” that isn’t capable of understanding how certain discussions that the software might flag as unrelated to the course tie into broader class themes.
“Those pedagogical choices of an instructor might not be present in a syllabus, so to just feed that into a chatbot and say, ‘Is this topic mentioned?’ tells you nothing about how it’s talked about or in what way,” Nichols said.
Castillo’s description of her experience testing the AI tool was the only time in the records reviewed by the Tribune when Texas A&M administrators discussed specific search terms being used to inspect course content. In another email, Castillo said she would share search terms with staff in person or by phone rather than email.
System officials did not provide the list of search terms the system plans to use in the audit.
Martin Peterson, a Texas A&M philosophy professor who studies the ethics of technology, said faculty have not been asked to weigh in on the tool, including members of the university’s AI council. He noted that the council’s ethics and governance committee is charged with helping set standards for responsible AI use.
While Peterson generally opposes the push to audit the university system’s courses, he said he is “a little more open to the idea that some such tool could perhaps be used.”
“It is just that we have to do our homework before we start using the tool,” Peterson said.
AI-assisted revisions
At Texas State University, officials ordered faculty to rewrite their syllabi and suggested they use AI to do it.
In October, administrators flagged 280 courses for review and told faculty to revise titles, descriptions and learning outcomes to remove wording the university said was not neutral. Records indicate that dozens of courses set to be offered by the College of Liberal Arts in the Spring 2026 semester were singled out for neutrality concerns. They included courses such as Intro to Diversity, Social Inequality, Freedom in America, Southwest in Film and Chinese-English Translation.
Faculty were given until Dec. 10 to complete the rewrites, with a second-level review scheduled in January and the entire catalog to be evaluated by June.
Administrators shared with faculty a guide outlining wording they said signaled advocacy. It discouraged learning outcomes that “measure or require belief, attitude or activism (e.g., value diversity, embrace activism, commit to change).”
Administrators also provided a prompt for faculty to paste into an AI writing assistant alongside their materials. The prompt instructs the chatbot to “identify any language that signals advocacy, prescriptive conclusions, affective outcomes or ideological commitments” and generate three alternative versions that remove those elements.
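Based on that description, the workflow resembles the sketch below: the fixed prompt and a faculty member’s course text are sent to a chat assistant, which returns candidate rewrites. The exact prompt wording and the assistant the university points faculty to are not public, so the prompt here paraphrases the memo and the API call is an assumption for illustration.

```python
# Rough sketch of the rewrite workflow described in Texas State's guidance.
# The prompt paraphrases the article's description; the model and API are assumptions.
from openai import OpenAI

client = OpenAI()

REWRITE_PROMPT = (
    "Identify any language that signals advocacy, prescriptive conclusions, "
    "affective outcomes or ideological commitments, and generate three "
    "alternative versions that remove those elements."
)

course_description = "Combating Racism in Healthcare: ..."  # faculty member's own text

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder; the guidance does not name a model
    messages=[
        {"role": "system", "content": REWRITE_PROMPT},
        {"role": "user", "content": course_description},
    ],
)
print(response.choices[0].message.content)  # candidate rewrites for the faculty member to review
```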
Jayme Blaschke, assistant director of media relations at Texas State, described the internal review as “thorough” and “deliberative,” but would not say whether any classes have already been revised or removed, only that “measures are in place to guide students through any adjustments and keep their academic progress on track.” He also declined to explain how courses were initially flagged and who wrote the neutrality expectations.
Faculty say the changes have reshaped how curriculum decisions are made on campus.
Aimee Villarreal, an assistant professor of anthropology and president of Texas State’s American Association of University Professors chapter, said the process is usually faculty-driven and unfolds over a longer period of time. She believes the structure of this audit allows administrators to more closely monitor how faculty describe their disciplines and steer how that material must be presented.
She said the requirement to revise courses quickly or risk having them removed from the spring schedule has created pressure to comply, which may have pushed some faculty toward using the AI writing assistant.
Villarreal said the process reflects a lack of trust in faculty and their field expertise when deciding what to teach.
“I love what I do,” Villarreal said, “and it’s very sad to see the core of what I do being undermined in this way.”
Nichols warned the trend of using AI in this way represents a larger threat.
“This is a kind of de-professionalizing of what we do in classrooms, where we’re narrowing the horizon of what’s possible,” he said. “And I think once we give that up, that’s like giving up the whole game. That’s the whole purpose of why universities exist.”
The Texas Tribune partners with Open Campus on higher education coverage.
Disclosure: Baylor University, Texas A&M University and Texas A&M University System have been financial supporters of The Texas Tribune, a nonprofit, nonpartisan news organization that is funded in part by donations from members, foundations and corporate sponsors. Financial supporters play no role in the Tribune’s journalism. Find a complete list of them here.
This article first appeared on The Texas Tribune.
Author: Jessica Priest
Published on: 2025-12-20 21:30:00
Source: www.the74million.org




