Today, we're excited to announce updates to the Amazon Nova Act SDK. Since our research preview, we've seen incredible adoption from developers building complex, reliable agents. We're adding new enterprise-grade capabilities to the preview through AWS integration, enabling select customers to move from prototype to production with 90%+ end-to-end reliability on early enterprise use cases. Interested in productizing your agent prototype? Learn more: https://amzn.to/40PkFfq
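For readers who haven't tried the SDK yet, here is a minimal sketch of what a Nova Act script looks like, assuming the research-preview Python package (`nova_act`, its `NovaAct` class, and the `act()` method); the enterprise preview's interface may differ.

```python
# Minimal Nova Act sketch. Assumes the research-preview `nova_act` package:
# the NovaAct class, starting_page argument, and act() method are taken from
# the preview quickstart and may differ in the enterprise preview.
from nova_act import NovaAct

with NovaAct(starting_page="https://www.amazon.com") as nova:
    # Each act() call is a small, prescriptive step the agent carries out in the browser.
    nova.act("search for a 12-cup coffee maker")
    nova.act("select the first result and add it to the cart")
```

Breaking a workflow into small `act()` calls like this is the pattern the preview documentation encourages for building reliable multi-step agents.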
Amazon Science
Research Services
Seattle, Washington · 376,206 followers
The latest news and research from Amazon’s science community. #AmazonScience
About us
Amazon Science gives you insight into the company’s approach to customer-obsessed scientific innovation. Amazon fundamentally believes that scientific innovation is essential to being the most customer-centric company in the world. It’s the company’s ability to have an impact at scale that allows us to attract some of the brightest minds in artificial intelligence and related fields. Our scientists continue to publish, teach, and engage with the academic community, in addition to utilizing our working backwards method to enrich the way we live and work. Follow us on LinkedIn and visit our website to get a deep dive on innovation at Amazon, and explore the many ways you can engage with our scientific community. #AmazonScience
- Website: https://www.amazon.science
- Industry: Research Services
- Company size: 10,001+ employees
- Headquarters: Seattle, Washington
- Founded: 2020
- Specialties: Artificial Intelligence, Machine Learning, Computer Vision, Cloud, Economics, Sustainability, AI, ML, Conversational AI, Natural Language Processing, NLP, Robotics, Security, Privacy, Information, Knowledge Management, Operations, Scientific Research, Search, Amazon, and Alexa
Updates
To improve LLM safety, Amazon researchers demonstrate that using multiple AI agents to generate and refine "chain of thought" training data boosts benchmark performance by 29% on average. Presented at ACL this week, the findings show how this approach enhances AI reasoning capabilities while improving safety and policy compliance: https://amzn.to/3IXrhlV
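As a rough illustration of the idea (not the paper's implementation), a multi-agent pipeline for chain-of-thought data typically alternates generation, critique against a safety policy, and refinement. The helper callables below are hypothetical stand-ins for LLM completion calls.

```python
# Illustrative sketch of multi-agent chain-of-thought data refinement.
# `generate`, `critique`, and `refine` are hypothetical wrappers around
# any LLM completion call; they are not Amazon's implementation.
from typing import Callable

def multiagent_cot(question: str, policy: str,
                   generate: Callable[[str], str],
                   critique: Callable[[str], str],
                   refine: Callable[[str], str],
                   rounds: int = 2) -> str:
    """Produce a policy-compliant chain of thought via generate -> critique -> refine."""
    cot = generate(f"Question: {question}\nThink step by step.")
    for _ in range(rounds):
        feedback = critique(
            f"Policy: {policy}\nReasoning: {cot}\n"
            "List any factual or policy problems, or reply 'OK'."
        )
        if feedback.strip() == "OK":
            break  # the critic is satisfied; keep the current reasoning
        cot = refine(f"Reasoning: {cot}\nFeedback: {feedback}\nRewrite the reasoning.")
    return cot  # candidate supervised fine-tuning example
```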
A new cost-to-serve-software metric reveals hidden inefficiencies across the entire software development lifecycle. Amazon researchers demonstrate how AI-powered tools are improving developer efficiency, providing quantifiable evidence of reduced costs in software delivery: https://amzn.to/4l8niR1
We're excited to kick off ACL in Vienna, Austria, where Amazon will be showcasing cutting-edge work in computational linguistics and natural language processing. Our nearly 30 accepted publications span spoken language models, vision-language grounding, retrieval-augmented generation systems, and reinforcement learning approaches. Learn more: https://amzn.to/4f8MTry #ACL2025 Association for Computational Linguistics
In this thought-provoking analysis, Amazon AGI Lab scientist Dr. Danielle Perszyk challenges conventional wisdom about the future of AI-human interaction. Her team is reimagining the relationship between knowledge workers and machines. Rather than pursuing AI that thinks for us, Amazon researchers are developing agents that enhance human cognition while preserving human agency. She explores:
• Why digital tools often impede rather than improve productivity
• The vision of AI as humanity's "collective subconscious"
• How Amazon Nova Act is building foundations for reliable human-agent collaboration
Learn more: https://amzn.to/4kZgi91
Announcing the winners of the Amazon Nova AI Challenge! 🏆 After six months of intense competition, we're proud to celebrate Team PurpCorn-PLAN (UIUC) as Defending Champions and Team PurCL (Purdue) as Attacking Champions, along with runners-up Team AlquistCoder (CTU Prague) and Team RedTWIZ (NOVA University Lisbon). This inaugural challenge showcased new safety techniques, adversarial tools, and multi-turn evaluation methods that push the frontier of secure, trustworthy AI. Learn more: https://amzn.to/4nYITxH
Introducing Mitra: a foundation model from Amazon researchers that outperforms traditional methods for tabular data by learning from diverse synthetic priors. Available in AutoGluon 1.4 soon, Mitra uses in-context learning to adapt to new tasks without requiring separate models for each dataset: https://amzn.to/4lBeowh
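To give a sense of how a tabular foundation model slots into an existing workflow, here is a sketch using AutoGluon's standard `TabularPredictor` interface. The `"MITRA"` hyperparameters key and the file names are assumptions, since the post doesn't specify the 1.4 release details.

```python
# Sketch of using a tabular foundation model through AutoGluon's standard
# TabularPredictor API. The "MITRA" hyperparameters key is an assumption about
# how AutoGluon 1.4 exposes the model; the fit/predict calls are standard.
from autogluon.tabular import TabularDataset, TabularPredictor

train = TabularDataset("train.csv")   # any tabular dataset with a target column
test = TabularDataset("test.csv")

predictor = TabularPredictor(label="target").fit(
    train,
    hyperparameters={"MITRA": {}},    # assumed key for the Mitra model (see lead-in)
)
predictions = predictor.predict(test)  # in-context adaptation: no per-dataset model design
```

Keeping the usual fit/predict workflow means a model like this can be swapped in without changing downstream code.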
Amazon researchers developed a new architecture that reduces a foundation model's inference time by 30% while maintaining its accuracy. Like specialized regions in the brain, the system selects an appropriate subset of neurons for each task: https://amzn.to/3IKPA6p
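For intuition only (this is not the published architecture), the sketch below shows one way a layer can restrict computation to a task-dependent subset of its neurons so that inference cost scales with the subset size; all class and parameter names here are illustrative.

```python
# Illustrative sketch: a feed-forward layer that computes only a task-dependent
# subset of its hidden neurons, so FLOPs scale with `keep` rather than d_hidden.
import torch
import torch.nn as nn

class TaskRoutedFFN(nn.Module):
    def __init__(self, d_model: int, d_hidden: int, n_tasks: int, keep: int):
        super().__init__()
        self.up = nn.Linear(d_model, d_hidden)
        self.down = nn.Linear(d_hidden, d_model)
        # One learned relevance score per (task, hidden neuron).
        self.task_scores = nn.Parameter(torch.randn(n_tasks, d_hidden) * 0.01)
        self.keep = keep  # number of neurons kept per task

    def forward(self, x: torch.Tensor, task_id: int) -> torch.Tensor:
        # Pick the top-k neurons for this task, then multiply only the matching
        # rows of the up projection and columns of the down projection.
        idx = torch.topk(self.task_scores[task_id], self.keep).indices
        h = torch.relu(x @ self.up.weight[idx].t() + self.up.bias[idx])
        return h @ self.down.weight[:, idx].t() + self.down.bias
```

In a real system the routing scores would be trained jointly with the rest of the network, and the subset size trades accuracy against latency.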