Cloud Next ‘24, Intel presents Gaudi 3, Symbolica’s structured learning

Google keeps pushing ahead on AI.

Hello, Starters!

From chips to APIs, tools to updates, the industry never rests, and neither do we. If you share our passion for AI, you will never get bored or stop learning new things.

Here’s what you’ll find today:

  • AI takes over Google’s Cloud Next 2024

  • Intel unveils its new Gaudi 3 AI chip

  • A symbolic approach to AI

  • Poe’s revenue model for creators

  • OpenAI improves GPT-4 Turbo’s API

  • And more.

Cloud Next 2024, Google's annual cloud conference, kicked off yesterday with a slate of announcements that underscore the company's continued push into AI. Among them: an AI video creation tool, a coding assistant, a text-to-image generator, and an AI agent builder.

Google remains one of the leaders in the AI race, roughly neck and neck with Microsoft. These announcements make it clear that part of its strategy is to drive further development through tools that make new applications easier to build, capped off by a new Arm-based CPU designed to support much of its AI workload.

Aiming to challenge Nvidia's dominance, Intel has announced its latest AI accelerator, Gaudi 3, a chip designed to cut training times for models such as Llama and GPT-3, with Intel claiming a roughly 50% improvement over Nvidia's H100.

Gaudi 3 includes 96 MB of onboard SRAM cache, 128 GB of HBM2e memory, and improved processing power, while prioritising cost-effectiveness and scalability. The chip will be available in the second quarter of 2024.

Symbolica, an AI startup led by former Tesla engineer George Morgan, is breaking away from the traditional approach to building AI models. Instead, it's aiming to establish scientific foundations that could lead to structured models that excel at accuracy.

The startup is harnessing "category theory", a branch of mathematics, to craft AI models with human-like reasoning capabilities, rather than relying on massive datasets, an approach that often brings limitations of its own and a heavy computational cost.
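For a flavour of what "structured" composition can mean, here's a minimal, purely illustrative Python sketch of category-style composition, where transformations are typed morphisms that only compose when their types line up. The Morphism class and the toy pipeline are our own illustration, not Symbolica's actual design.

```python
# Illustrative only: a toy "category" of typed transformations.
# Morphisms compose only when the output type of one matches the
# input type of the next -- the kind of structural guarantee a
# category-theoretic approach can enforce by construction.
from dataclasses import dataclass
from typing import Any, Callable

@dataclass
class Morphism:
    source: type                 # domain (input type)
    target: type                 # codomain (output type)
    fn: Callable[[Any], Any]

    def __call__(self, x: Any) -> Any:
        return self.fn(x)

    def then(self, other: "Morphism") -> "Morphism":
        # Composition is only defined when the types line up.
        if self.target is not other.source:
            raise TypeError(f"Cannot compose {self.target.__name__} -> {other.source.__name__}")
        return Morphism(self.source, other.target, lambda x: other.fn(self.fn(x)))

# Hypothetical pipeline: tokenize text, then count the tokens.
tokenize = Morphism(str, list, lambda s: s.split())
count = Morphism(list, int, len)

pipeline = tokenize.then(count)
print(pipeline("structured models compose cleanly"))  # -> 4
```

The point of the sketch is that ill-typed pipelines fail at construction time rather than at runtime, which is one way structure can substitute for brute-force scale.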

💵Poe, Quora's chatbot platform, is now enabling developers to generate revenue. Under the new model, bot creators can set a price per message for their Poe-powered bots, earning money each time a user messages them.

🤖Developers can now harness an upgraded GPT-4 Turbo: the model, with vision capabilities included, is generally available through the API. OpenAI shared the news in a post on X, showcasing examples of how the model is being used to build novel applications.
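If you want to try it yourself, a call through the official openai Python library looks roughly like the sketch below. The model name, image URL, and prompt are placeholders chosen for illustration; check OpenAI's documentation for the exact identifiers available to your account.

```python
# Minimal sketch, assuming openai>=1.0 is installed and OPENAI_API_KEY is set.
from openai import OpenAI

client = OpenAI()

response = client.chat.completions.create(
    model="gpt-4-turbo",  # assumed name for the generally available GPT-4 Turbo with vision
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "Describe what's in this image."},
                {"type": "image_url", "image_url": {"url": "https://example.com/photo.jpg"}},
            ],
        }
    ],
    max_tokens=200,
)

print(response.choices[0].message.content)
```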

What did you think of today's newsletter?

Thank you for reading!