Google's Gemini unveils major updates: Faster and lighter Gemini 1.5 Flash AI model out now

Both Gemini 1.5 Pro and 1.5 Flash are now available in public preview, offering a 1 million token context window on Google AI Studio and Vertex AI.

Google Gemini
Pranav Dixit
  • May 14, 2024
  • Updated May 15, 2024, 1:33 AM IST

Google's Gemini project is breaking new ground in the world of artificial intelligence (AI) with a series of updates that promise faster performance, extended context capabilities, and a vision for future AI assistants.

Led by Demis Hassabis, CEO of Google DeepMind, the Gemini team has been at the forefront of AI innovation. In December, they introduced the first natively multimodal model, Gemini 1.0, in various sizes. Just months later, they rolled out Gemini 1.5 Pro, boasting enhanced performance and a groundbreaking long context window of 1 million tokens.

The response from developers and enterprise customers has been overwhelmingly positive, with users lauding the model's long context window, multimodal reasoning abilities, and overall performance.

Responding to user feedback for applications requiring lower latency and cost efficiency, Google is introducing Gemini 1.5 Flash. This lightweight model is designed for speed and efficiency at scale, offering a faster alternative to its predecessor, Gemini 1.5 Pro.

Gemini 1.5 Flash is available with a 1 million token context window, while Gemini 1.5 Pro has been expanded to a 2 million token context window. There is currently no waitlist for Gemini 1.5 Flash. Furthermore, Gemini 1.5 Pro is being integrated into various Google products, including Gemini Advanced and the Workspace apps.

In the realm of natural language processing (NLP) and machine learning, a context window refers to a specified range of surrounding words or tokens considered when processing a particular word or token within a text sequence.

The context window helps algorithms understand the meaning of a word or token by taking into account the words or tokens that appear nearby. This context is crucial for tasks such as word embeddings, where words are represented as dense vectors based on their contextual usage in a large corpus of text.

For example, with a context window of size 3, if we're processing the word "fox" in the sentence "The quick brown fox jumps over the lazy dog," the context window would include the words "The quick brown" before "fox" and "jumps over the" after it. The context window allows the algorithm to capture the semantic meaning of "fox" based on its surrounding words.
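The window selection described above can be sketched in a few lines of Python (a minimal illustration; the function name and example sentence are our own, not part of any Gemini API):

```python
def context_window(tokens, index, size):
    """Return the tokens up to `size` positions before and after tokens[index]."""
    before = tokens[max(0, index - size):index]
    after = tokens[index + 1:index + 1 + size]
    return before, after

sentence = "The quick brown fox jumps over the lazy dog".split()

# Context window of size 3 around "fox" (index 3):
before, after = context_window(sentence, sentence.index("fox"), 3)
# before -> ['The', 'quick', 'brown']
# after  -> ['jumps', 'over', 'the']
```

A model's advertised context window works the same way at a much larger scale: a 1 million token window means up to a million tokens of surrounding text can inform each prediction.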

Additionally, Google unveiled Gemma 2.0, the next generation of open models aimed at fostering responsible AI innovation. Gemma 2.0 boasts breakthrough performance and efficiency, with new sizes and capabilities.

Project Astra represents Google's vision for the future of AI assistants, aimed at developing universal agents capable of understanding and responding to complex real-world scenarios. Leveraging advancements in multimodal information processing, Google aims to create agents that can respond quickly and naturally, akin to human interaction.
