Quick Facts
- Category: Reviews & Comparisons
- Published: 2026-05-04 20:40:10
Introduction
As organizations accelerate their adoption of AI on Amazon Bedrock, finance and engineering leaders face a familiar challenge: understanding exactly who is spending what. Without granular cost visibility, teams struggle to optimize model usage, allocate budgets, and prove ROI. That’s where the new IAM principal cost allocation feature for Amazon Bedrock comes in. This guide walks you through setting it up step by step, so you can map model inference costs back to specific users, roles, or teams. You’ll also learn how to complement this with AWS Agent Registry for agent governance and how to access the cutting-edge Claude Mythos model preview. By the end, you’ll have a clear, actionable path to mastering AI cost management.

What You Need
Before you begin, ensure you have the following prerequisites:
- An active AWS Account with Amazon Bedrock enabled in at least one region (e.g., us-east-1).
- Permissions to create and manage IAM users, roles, and tags (typically iam:* or specific tag-related actions).
- Access to the AWS Billing and Cost Management console. You’ll need billing administrator privileges to activate tag-based cost allocation.
- Familiarity with AWS Cost Explorer or the Cost and Usage Report (CUR) to analyze the resulting data.
- For optional extras: AWS CLI or SDK installed (to interact with Agent Registry) and an approval for the Project Glasswing research preview if you wish to use Claude Mythos.
Step-by-Step Instructions
Step 1: Identify IAM Users and Roles to Tag
Start by listing the IAM principals (users or roles) that are accessing Amazon Bedrock. These could be developers building AI agents, data scientists running foundation model evaluations, or automated CI/CD pipelines. Use the IAM console to review existing users and roles, or plan new ones if needed. For each principal, decide on the tags that align with your organizational structure—common examples are team, cost-center, project, or environment.
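As a starting point, you can audit which principals already carry your chosen tag keys. The sketch below assumes you have fetched each principal's tags (for example via boto3's iam.list_user_tags or iam.list_role_tags); the data shape and names are illustrative, not a fixed API.

```python
# Sketch: given IAM principals whose tags you have already fetched, list the
# ones missing a required tag key so you know where to apply tags in Step 2.

def missing_tag(principals, required_key):
    """Return names of principals that lack the given tag key.

    `principals` entries look like
    {"name": "alice", "tags": [{"Key": "team", "Value": "engineering"}]}.
    """
    untagged = []
    for principal in principals:
        keys = {tag["Key"] for tag in principal.get("tags", [])}
        if required_key not in keys:
            untagged.append(principal["name"])
    return untagged

if __name__ == "__main__":
    sample = [
        {"name": "alice", "tags": [{"Key": "team", "Value": "engineering"}]},
        {"name": "ci-pipeline", "tags": []},
    ]
    print(missing_tag(sample, "team"))  # ['ci-pipeline']
```

Running this against your real principal list tells you exactly which users and roles still need tagging before costs can be attributed to them.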
Step 2: Apply Tags to IAM Principals
Navigate to the IAM console and select a user or role. Under the Tags tab, add one or more key‑value pairs. For instance:
- team: engineering
- cost-center: 12345
You can apply tags using the AWS Management Console, AWS CLI (aws iam tag-user or aws iam tag-role), or infrastructure‑as‑code tools. Ensure that the tag keys you choose are consistent across all principals to simplify reporting.
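If you prefer the SDK route, here is a minimal boto3 sketch. iam.tag_user and iam.tag_role are real IAM operations; the principal name and tag values below are placeholders for your own.

```python
# Sketch of applying tags programmatically with boto3 (the AWS SDK for Python).

def to_tag_list(tags):
    """Convert {'team': 'engineering'} into the [{'Key': ..., 'Value': ...}]
    shape the IAM tagging APIs expect, in a stable (sorted) order."""
    return [{"Key": key, "Value": value} for key, value in sorted(tags.items())]

def tag_principal(name, tags, principal_type="user"):
    """Apply the given tags to an IAM user or role."""
    import boto3  # deferred so to_tag_list stays usable without the SDK
    iam = boto3.client("iam")
    if principal_type == "user":
        iam.tag_user(UserName=name, Tags=to_tag_list(tags))
    else:
        iam.tag_role(RoleName=name, Tags=to_tag_list(tags))

# Usage (requires AWS credentials with IAM tagging permissions):
#   tag_principal("alice", {"team": "engineering", "cost-center": "12345"})
```

Sorting the tag keys is a small design choice that keeps diffs stable when you manage tags through code review.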
Step 3: Activate Tags in Billing and Cost Management
Now go to the AWS Billing and Cost Management console. Under Cost Allocation Tags, find the tag keys you just created. Select each key and choose Activate. This tells AWS to start recording those tag values in your cost data. Activation can take up to 24 hours to take full effect, so plan accordingly. Once active, the tags appear in Cost Explorer and in the detailed Cost and Usage Report.
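The console clicks above can also be scripted: the Cost Explorer API exposes UpdateCostAllocationTagsStatus, which takes a list of {TagKey, Status} entries. A sketch, with example tag keys:

```python
# Sketch of activating cost allocation tags from code instead of the console.

def activation_payload(tag_keys):
    """Build the CostAllocationTagsStatus list for the given tag keys."""
    return [{"TagKey": key, "Status": "Active"} for key in tag_keys]

def activate_cost_allocation_tags(tag_keys):
    import boto3  # deferred so activation_payload stays usable without the SDK
    ce = boto3.client("ce")  # Cost Explorer
    ce.update_cost_allocation_tags_status(
        CostAllocationTagsStatus=activation_payload(tag_keys)
    )

# Usage (requires billing-administrator credentials):
#   activate_cost_allocation_tags(["team", "cost-center"])
```

Note that a tag key only appears in this API after it has been used on at least one resource and picked up by billing, so apply tags first and activate afterwards.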
Step 4: View AI Costs by Tag
After activation, open AWS Cost Explorer. Create a new report and group by the tag key (e.g., team). You’ll see Bedrock model inference costs broken down by the tagged principals. Alternatively, query your Cost and Usage Report via Athena or QuickSight for deeper analysis. This visibility lets you track spending per team, cost center, or project, answering the key question: who is spending what on AI?
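The same grouped view is available programmatically via Cost Explorer's GetCostAndUsage API. In this sketch, the date range, tag key, and SERVICE filter value are assumptions to adapt (verify the exact service name for Bedrock as it appears in your own bill):

```python
# Sketch of pulling Bedrock costs grouped by a tag key, then totaling them
# per tag value across the returned time periods.

def costs_by_tag(response):
    """Total UnblendedCost per tag group.

    Group keys look like 'team$engineering'; 'team$' means untagged usage."""
    totals = {}
    for period in response["ResultsByTime"]:
        for group in period["Groups"]:
            key = group["Keys"][0]
            amount = float(group["Metrics"]["UnblendedCost"]["Amount"])
            totals[key] = totals.get(key, 0.0) + amount
    return totals

def fetch_bedrock_costs(tag_key, start, end):
    import boto3  # deferred so costs_by_tag stays usable without the SDK
    ce = boto3.client("ce")
    response = ce.get_cost_and_usage(
        TimePeriod={"Start": start, "End": end},
        Granularity="MONTHLY",
        Metrics=["UnblendedCost"],
        GroupBy=[{"Type": "TAG", "Key": tag_key}],
        Filter={"Dimensions": {"Key": "SERVICE", "Values": ["Amazon Bedrock"]}},
    )
    return costs_by_tag(response)

# Usage (requires Cost Explorer access):
#   fetch_bedrock_costs("team", "2026-04-01", "2026-05-01")
```

The 'team$' group (empty tag value) is worth watching: non-zero spend there means some principals are still untagged.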

Step 5: (Optional) Explore AWS Agent Registry for Centralized Governance
While cost allocation is critical, managing AI agents at scale is another challenge. Amazon Bedrock AgentCore now provides a private catalog for agents, tools, and MCP servers. To use it, access the AgentCore Console. You can search semantically or by keyword to discover existing agents, helping teams avoid duplication. Approval workflows and CloudTrail audit trails ensure governance. Integrate the registry with your development workflow via the AWS CLI, SDK, or as an MCP server queryable from IDEs.
Step 6: (Optional) Request Access to Claude Mythos Preview
For advanced cybersecurity and reasoning tasks, you can request a gated preview of Anthropic’s Claude Mythos model through Amazon Bedrock. This preview, part of Project Glasswing, is limited to allowlisted organizations—priority is given to internet‑critical companies and open‑source maintainers. If your use case involves vulnerability discovery or complex code analysis, submit a request via your AWS account team. Once approved, you can invoke the model using the standard Bedrock API, now with cost visibility from your new tags.
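Once approved, invocation looks like any other Bedrock call. The sketch below uses the Bedrock Runtime Converse API; the model ID is a placeholder — your AWS account team will provide the real identifier for the gated preview.

```python
# Sketch of invoking a Bedrock model with the Converse API once preview
# access is approved. The model ID shown in the usage note is hypothetical.

def build_messages(prompt):
    """Shape a single user turn the way the Converse API expects."""
    return [{"role": "user", "content": [{"text": prompt}]}]

def invoke_preview_model(model_id, prompt):
    import boto3  # deferred so build_messages stays usable without the SDK
    runtime = boto3.client("bedrock-runtime")
    response = runtime.converse(modelId=model_id, messages=build_messages(prompt))
    return response["output"]["message"]["content"][0]["text"]

# Usage (requires approved preview access; model ID is a placeholder):
#   invoke_preview_model("anthropic.claude-mythos-preview-v1",
#                        "Analyze this function for memory-safety issues: ...")
```

Because the call runs under a tagged IAM principal, the resulting inference spend flows into the same per-team cost reports you built in Step 4.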
Tips and Best Practices
- Plan your tagging strategy before you start. Choose tag keys that reflect your chargeback or show‑back model. Avoid personal identifiers—stick to team, project, or cost center.
- Automate tag application using IAM policies that require tags at resource creation. This ensures every new user or role is automatically accounted for.
- Monitor cost data regularly. Set up anomaly detection in Cost Explorer to catch unexpected spikes in Bedrock usage, such as a runaway agent loop.
- Combine IAM cost allocation with Agent Registry. Tag your agents similarly to users, so you can correlate agent‑level spend with the IAM principal that invoked it.
- Use the Cost and Usage Report for advanced analysis. Export it to Amazon S3 and query it with Athena for custom reports—this gives you the most granular data.
- Stay informed about preview models like Claude Mythos. They often require separate tagging considerations; confirm that your IAM principals have the necessary permissions to invoke the model.
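To make the tag-enforcement tip above concrete: tag-on-create for IAM supports the aws:RequestTag condition key, so a policy can deny creating untagged principals. A sketch, assuming a required team tag (the Null condition matches when the tag is absent from the request):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "DenyUntaggedPrincipalCreation",
      "Effect": "Deny",
      "Action": ["iam:CreateUser", "iam:CreateRole"],
      "Resource": "*",
      "Condition": {
        "Null": { "aws:RequestTag/team": "true" }
      }
    }
  ]
}
```

Attach this to the administrators who create principals, and every new user or role arrives pre-tagged and ready for cost attribution.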
With these steps, you’re now equipped to bring full cost transparency to your AI initiatives on Amazon Bedrock. No more guesswork—just clear, actionable insights that help you scale AI responsibly.