Amazon Bedrock, a service that gives users access to a selection of generative AI models developed by Amazon and by third-party partners through an application programming interface (API), became generally available today, the company announced.
Bedrock, which was introduced at the beginning of April, lets AWS customers build applications on top of generative AI models and customize those applications with their own proprietary data.
Using these models, companies and developers can also create artificial intelligence “agents” that can autonomously carry out tasks such as booking trips, maintaining inventories, and processing insurance claims.
Amazon announced that Meta’s open source large language model Llama 2 will be added to Bedrock in the coming weeks, joining models from AI21 Labs, Anthropic, Cohere, and Stability AI.
Amazon says Bedrock will be the first “fully managed generative AI service” to offer Llama 2, specifically the 13-billion-parameter and 70-billion-parameter variants of the model. (Parameters are the components of a model learned from training data, and they essentially define the model’s ability to perform a task, such as generating text.) It is worth pointing out, however, that Llama 2 has been accessible on other cloud-hosted generative AI platforms for some time now, including Google’s Vertex AI.
Bedrock is, in many respects, analogous to Vertex AI, which provides customers with a library of finely configurable first- and third-party models on which they can build generative AI applications. But Swami Sivasubramanian, AWS’s vice president of data and artificial intelligence, contends that Bedrock has an advantage in its tight integration with other AWS services, such as AWS PrivateLink, which creates a secure connection between Bedrock and a particular business’s virtual private cloud.
To be fair to Google, that is arguably more a perceived advantage than an objective one, since it depends on the customer in question and the cloud infrastructure they already use. Naturally, you won’t hear Sivasubramanian admit as much.
“Over the past year, the proliferation of data, access to scalable compute, and advancements in machine learning have led to a surge of interest in generative AI,” Sivasubramanian said in a press release, adding that this has sparked new ideas with the potential to alter entire industries and reinvent how work is carried out. “The announcement that was made today is a major milestone that puts generative AI within reach of every business, from startups to enterprises, and every employee, from developers to data analysts,” he said.
In related news, Amazon this morning announced its Titan Embeddings model, a first-party model that translates text into numerical representations called embeddings to power search and personalization applications.
On par with OpenAI’s most recent embeddings model, Titan Embeddings supports approximately 25 languages and chunks of text, or full documents, up to 8,192 tokens (roughly 6,000 words) in length.
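As a rough illustration of how a developer might call Titan Embeddings through the AWS SDK for Python (boto3), here is a minimal sketch. The model ID (`amazon.titan-embed-text-v1`) and the request shape are assumptions based on AWS’s published examples, not a verified specification of the final API.

```python
import json


def build_embed_request(text: str) -> str:
    # Assumption: Titan Embeddings expects a JSON body with a single
    # "inputText" field containing the text to embed.
    return json.dumps({"inputText": text})


def embed_text(text: str) -> list[float]:
    # boto3 is the AWS SDK for Python; "bedrock-runtime" is the client
    # used to invoke models hosted on Bedrock.
    import boto3

    client = boto3.client("bedrock-runtime")
    response = client.invoke_model(
        modelId="amazon.titan-embed-text-v1",  # assumed model ID
        body=build_embed_request(text),
        contentType="application/json",
        accept="application/json",
    )
    payload = json.loads(response["body"].read())
    # Assumption: the response body carries the vector under "embedding".
    return payload["embedding"]
```

Running `embed_text` requires AWS credentials and a region where Bedrock is enabled; the returned list of floats can then be stored in a vector index to drive search or personalization.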
Bedrock’s start was not easy. Bloomberg reported in May that, six weeks after Amazon demonstrated the technology with an extremely vague presser and only one testimonial, the majority of cloud customers still did not have access.
Today’s announcements, along with Amazon’s recent multibillion-dollar investment in the AI firm Anthropic, make clear that Amazon is seeking to make waves in the expanding and lucrative market for generative AI.