Visitors will have to put their Apple devices in a Faraday cage
A hot potato: It's no secret that Elon Musk really isn't a fan of OpenAI – as evidenced by his lawsuit against the company. Apple's partnership with the ChatGPT maker seems to have riled the billionaire even more, to the point where he has threatened to ban iPhones from all his companies.
Editor's take: As much as it would make sense for Nvidia to focus solely on being the leading AI silicon vendor, its rise to power has left it with little choice but to keep pushing into areas that make some of its biggest customers uncomfortable.
Forward-looking: OpenAI just introduced GPT-4o (the "o" stands for "omni"). The model is no "smarter" than GPT-4, but some remarkable innovations still set it apart: the ability to process text, visual, and audio data simultaneously, almost no latency between question and answer, and an unbelievably human-sounding voice.
A hot potato: GPT-4 is the newest multimodal large language model (LLM) from OpenAI. The foundational model, currently available to customers through the paid ChatGPT Plus tier, has shown notable prowess in identifying security vulnerabilities without human assistance.
WTF?! Arm's CEO has sounded a warning bell about the energy required to advance AI algorithms. He cautions that within a few years, "AI data centers" could demand so much electricity that they could jeopardize the US power grid.
Hackers could deploy the worms in plain-text emails or hide them in images
In context: Big Tech continues to recklessly shovel billions of dollars into bringing AI assistants to consumers. Microsoft's Copilot, Google's Bard, Amazon's Alexa, and Meta's chatbot already run on generative AI engines. Apple is one of the few taking its time upgrading Siri, and it hopes to compete with an LLM that runs locally rather than in the cloud.