Two weeks ago, we shifted our survey of language AI to the view from CSOFT Health Sciences’ booth at DIA 2023, where our team found an event floor teeming with uses for large language models just half a year after the release of ChatGPT. With more medical communications now reflecting the work of generative AI, the ‘what next’ for these models in the life sciences is no longer a matter of imagined possibilities and potential, but of actual work in practice. That makes this week’s news that Apple is finally moving to develop its own suite of language models – as Meta, Google, and Microsoft have all done – a unique case in point from tech at large. Far from late to the party, the leader in everyday consumer health tech may be about to connect users with the communicative power of “Apple GPT” at a critical moment for AI’s adoption in areas of life where its technology is already robust, and it appears to be taking the time to ensure it does so correctly.
In Slator’s latest account of language AI, Apple’s overture comes at a moment when language models are “mushrooming as barriers to entry collapse.” Noting Synchronity Labs’ announcement of instant, multilingual dubbing capabilities that gave both Mark Zuckerberg and podcast producer Lex Fridman their own voices in Hindi, the language industry commentator rightly asks what a real linguistic tool is, as opposed to “a thin layer on top of ChatGPT.” What makes Apple’s efforts a likely case of the former is that its AI would connect an extensive array of software and hardware products with a large existing user base within Apple’s ecosystem. Just consider the immense amount of wearable health data Apple Watch users generate, as well as the trust users place in the device as a tool for tending to their wellbeing, and you have a prime foundation for making its insights more actionable and engaging through communicative features and back-end data analytics.
So far, Apple appears to be focused on developing tools for its own employees that it would then evaluate for public release. Beyond interactive wearables, one observer speculates that the customer support AppleCare extends to users worldwide could soon be equipped with an AI-powered chatbot capable of understanding and responding to customer queries in multiple languages. Beyond language barriers, there is hope that better AI can help resolve the dead ends common in customer communications, enabling personalized and timely support in any region. The chatbot’s ability to access Apple’s knowledge base conversationally would likely enhance the accessibility and effectiveness of support content. Still, all of this could hold equally true for Apple’s rivals in Silicon Valley as they develop their own proprietary models. What makes Apple’s case special is the potential for its product development to support the real mandates companies in the life sciences are seeking to address with AI: driving user-friendly solutions that truly enhance people’s health-related experiences and help ensure quality in the vast amounts of data researchers, clinicians, and patients themselves interact with.
As technology advances how people interact with devices, multilingual communications remain essential to delivering the latest innovations to users in global markets. With a worldwide network of translators in 250+ languages and technology-driven localization solutions, CSOFT can help businesses across the life sciences meet their communication needs on a global basis. To learn how our multilingual solutions help innovators accelerate their journey to market, visit lifesciences.csoftintl.com.