Amazon Web Services (AWS) has announced the launch of Amazon Bedrock and the Amazon Titan foundation models (FMs), making it easier for customers to build and scale generative AI applications. According to AWS, feedback from select customers identified a few major obstacles standing in the way of their AI use cases today. First, they need a straightforward way to find and access high-performing FMs that give outstanding results and are best suited for their purposes. Second, customers want integration into applications to be seamless, without having to manage huge clusters of infrastructure or incur large costs. Finally, customers want it to be easy to take the base FM and build differentiated apps using their own data.
To address these concerns, AWS has announced Amazon Bedrock, a new service that makes FMs from AI21 Labs, Anthropic, Stability AI, and Amazon accessible via an API. Bedrock is the easiest way for customers to build and scale generative AI-based applications using FMs, democratizing access for all builders. Bedrock offers the ability to access a range of powerful FMs for text and images—including Amazon’s Titan FMs, which consist of two new large language models (LLMs)—through a scalable, reliable, and secure AWS managed service.
Customers can choose from some of the most cutting-edge FMs available today, including the Jurassic-2 family of multilingual LLMs from AI21 Labs, which follow natural language instructions to generate text in multiple languages. Claude, Anthropic’s LLM, can perform a wide variety of conversational and text processing tasks, and Bedrock also makes it easy to access Stability AI’s suite of text-to-image foundation models, including Stable Diffusion, which is capable of generating unique, realistic, high-quality images, art, logos, and designs.
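Since Bedrock exposes these models through an API, an application call might look something like the sketch below. This is a hedged illustration only: the `bedrock-runtime` service name, the `invoke_model` operation, the model ID, and the request fields are assumptions about how the managed service could be consumed from boto3, not details confirmed in the announcement.

```python
import json


def build_text_request(prompt, max_tokens=200):
    """Build a JSON request body for a text-generation call (assumed shape)."""
    return json.dumps({
        "inputText": prompt,
        "textGenerationConfig": {"maxTokenCount": max_tokens},
    })


# With AWS credentials configured, the call might look like:
#
#   import boto3
#   client = boto3.client("bedrock-runtime")
#   response = client.invoke_model(
#       modelId="amazon.titan-text-express-v1",  # hypothetical model ID
#       body=build_text_request("Summarize the following article: ..."),
#   )
#   print(json.loads(response["body"].read()))
```

The point of the API-first design is that switching providers (say, from a Titan model to Claude or Jurassic-2) would mean changing the model ID and request shape, not re-provisioning infrastructure.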
One of the most important capabilities of Bedrock is the ease of customizing a model. You can fine-tune a model for a particular task without having to annotate large volumes of data. All you need is a small dataset, as few as 20 examples.
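To make the "as few as 20 examples" claim concrete, a fine-tuning dataset of that size might be a short file of prompt/completion pairs. The JSON Lines serialization and the `prompt`/`completion` field names below are an assumed format for illustration, not a documented Bedrock contract.

```python
import json

# A handful of task-specific examples; a real dataset would have ~20 of these.
examples = [
    {"prompt": "Write a tagline for a travel blog about hidden beaches.",
     "completion": "Secret shores, endless summers."},
    {"prompt": "Write a tagline for a coffee subscription service.",
     "completion": "Fresh roasts, delivered to your door."},
]


def to_jsonl(records):
    """Serialize records as JSON Lines: one JSON object per line."""
    return "\n".join(json.dumps(r) for r in records)


dataset = to_jsonl(examples)
print(dataset.splitlines()[0])
```

The appeal is that assembling a file like this takes minutes, compared with the weeks of labeling typically needed to train a task-specific model from scratch.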
Bedrock is now in limited preview, and the AWS team reports that it is receiving positive feedback from customers. Shishir Mehrotra, Co-founder and CEO of Coda, says, “As a longtime happy AWS customer, we’re excited about how Amazon Bedrock can bring quality, scalability, and performance to Coda AI.”
AWS has also announced the preview of the Amazon Titan FMs, which are built to detect and remove harmful content in the data, reject inappropriate content in the user input, and filter model outputs that contain inappropriate content. The Titan FMs consist of two new LLMs:
- A generative LLM for tasks such as summarization, text generation, classification, open-ended Q&A, and information extraction
- An embeddings LLM that translates text inputs into numerical representations that capture the semantic meaning of the text
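To illustrate what an embeddings model produces and why those numerical representations are useful: each text is mapped to a vector, and texts with similar meaning end up with vectors pointing in similar directions, typically compared via cosine similarity. The tiny three-dimensional vectors below are made up for illustration; real embeddings have hundreds of dimensions.

```python
import math


def cosine_similarity(a, b):
    """Cosine of the angle between two vectors; 1.0 means identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)


# Hypothetical embeddings: "cat" and "kitten" should land near each other,
# while "car" points in a different direction.
cat = [0.9, 0.1, 0.2]
kitten = [0.85, 0.15, 0.25]
car = [0.1, 0.9, 0.3]

print(cosine_similarity(cat, kitten) > cosine_similarity(cat, car))  # True
```

This is the building block behind use cases like semantic search and personalization, where documents are ranked by how close their embedding vectors are to a query's embedding.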
Bedrock makes the power of FMs accessible to companies of all sizes, so they can accelerate the use of machine learning (ML) across their organizations and build their own generative AI applications. AWS believes that Bedrock will be a massive step forward in democratizing FMs, and partners like Accenture, Deloitte, Infosys, and Slalom are building practices to help enterprises move faster with generative AI. Independent Software Vendors (ISVs) like C3 AI and Pega are excited to leverage Bedrock for easy access to its selection of FMs with all of the security, privacy, and reliability they expect from AWS.