In January, Amazon quickly incorporated DeepSeek AI models into its Bedrock platform in response to high customer demand. The company’s rapid response to the Chinese AI model’s success underscores the swift pace of innovation and competition within the tech industry.
Rapid Integration and Customer Demand
In late January, as DeepSeek’s technology gained industry attention, Amazon experienced a surge in requests from businesses seeking access to the model through its Bedrock development tool. According to sources, the approval process for integrating DeepSeek into Bedrock was unusually fast. Amazon CEO Andy Jassy later acknowledged the company’s quick action to meet customer needs.
Internal Reactions and Strategic Shifts
DeepSeek’s swift rise has triggered significant internal reactions across Amazon. According to internal documents and sources familiar with the matter, these reactions have influenced product updates, sales strategies, and development efforts. Amazon’s competitors, including OpenAI, Google, Meta, and Microsoft, have also responded to DeepSeek’s impact.
An Amazon spokesperson stated that the company’s strategy focuses on “providing secure access to the latest models through AWS, giving customers control over their data to customize and build generative AI applications.” The spokesperson added, “Delivering DeepSeek models is an example of that.”
Concerns and Competitive Positioning
DeepSeek’s January release of powerful and economical AI models led to a temporary downturn in tech stocks. Amazon has since acted to address the market shift as new AI technologies emerge. The cloud giant is now working on its own reasoning model to compete directly with DeepSeek’s R1, according to a source, and is exploring whether to apply some of DeepSeek’s training methodologies to the new model.
AWS has also introduced internal guidance for employees, emphasizing privacy and security when discussing DeepSeek with customers. Employees are encouraged to propose AWS’s Nova AI models as an alternative. The guidelines also highlight Bedrock, which is meant to provide a more secure way to access AI models: customer data is neither shared with model providers nor used to improve the models.
“DeepSeek’s privacy policy states they collect user data and may store them on servers in China,” the guidelines said. “We are aware of the privacy concerns on DeepSeek models.”
AWS has also instructed employees to highlight DeepSeek’s shortcomings in the company’s sales pitches. The guidelines claim that Nova offers both faster performance and stronger security than DeepSeek. According to the internal documents, the Nova models are closer in performance to DeepSeek’s V3 model than to the R1 reasoning model, and the two serve different applications: V3 is a “text-only model,” whereas Nova offers image and video support.
Internal Discussions and Future Directions
After DeepSeek’s unveiling stirred the stock market in January, Amazon workers created an internal Slack channel dubbed “Deepseek-interest,” which rapidly drew more than 1,300 employees and sparked discussions on topics like security concerns, use cases, and competition. One employee wrote in the channel about his surprise at the lack of pushback against DeepSeek, given its Chinese origin. Amazon has since advised employees not to use DeepSeek on work devices.
Innovation and Competitive Learning
During a call in February, Jassy stated that Amazon was “impressed” with DeepSeek’s training methods, including their use of “flipping the sequencing of reinforcement training” and some of its “inference optimizations.” He added, “For those of us who are building frontier models, we’re all working on the same types of things and we’re all learning from one another.”
As the AI field rapidly evolves, some Amazon employees are already looking beyond DeepSeek. One employee wrote in the internal Slack channel that AWS should consider alternatives like Alibaba’s Qwen. “DeepSeek is already the past day,” the employee said. “When do we have Qwen2.5-Max?”