Helping children learn to read sounds like an ideal use case for an LLM. An app that uses its own users' interactions to improve its capabilities is not inherently malicious, and it's vastly different from selling user data to third parties or training on content scraped from others.
And what are you even talking about with the “children could face disciplinary or legal consequences for noncompliance” nonsense? Where was that in the article?
Do you think the Department of Education writes the textbooks, standardized tests (SAT, ACT, etc.), grading and student management software, or learning management systems (Google Classroom, Canvas), or manufactures its own classroom tech (Chromebooks, tablets)? The education system is full of for-profit businesses that can jack up prices, and they do. The DOE simply doesn’t have the resources to create these things itself, and it would cost far more if it tried. The only new thing here is the AI; the business model has existed forever.
Personally, I’m more concerned with the use of Google products in schools. A company whose sole business is harvesting user data and selling it to advertisers should have no place in schools or children’s products. But they’ve embedded themselves into everything, so people just accept it at the cost of privacy.