AWS Lambda
Serverless event-driven computing.
AWS Lambda is a serverless, event-driven compute service provided by Amazon Web Services (AWS). It lets developers run code without provisioning or managing servers: code is uploaded as Lambda functions, and AWS automatically handles the infrastructure required to run and scale it with high availability. Functions are triggered by events such as changes to data in an Amazon S3 bucket, updates to an Amazon DynamoDB table, or [API Gateway](/en/terms/api-gateway) requests. When a trigger fires, Lambda executes the function, allocating the necessary compute resources.

Billing is based on the number of requests and on the duration and memory consumed during execution, with a generous free tier. Key features include automatic scaling, stateless execution environments (though state can be managed externally), support for multiple programming languages (Node.js, Python, Java, C#, Go, Ruby, PowerShell, and custom runtimes), and integration with a vast array of AWS services. Lambda functions are typically short-lived, designed for specific tasks rather than long-running applications.

Trade-offs include potential cold starts (latency when a function hasn't been invoked recently), execution time limits, limits on memory and temporary storage, and vendor lock-in to the AWS ecosystem. Lambda is ideal for event processing, microservices, data transformation, and backend APIs.
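As a concrete sketch, an S3-triggered function might look like the following. The handler name and the event shape follow Lambda's standard S3 event format; the actual processing work is left as a placeholder.

```python
# Minimal sketch of a Lambda handler triggered by S3 events.
# Lambda calls handler(event, context); for S3 triggers the event
# carries a "Records" list describing each affected object.

def handler(event, context):
    """Process each S3 record in the triggering event."""
    results = []
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        # Real work (e.g. resizing an image, parsing a file) would go here.
        results.append(f"processed s3://{bucket}/{key}")
    return {"statusCode": 200, "processed": results}
```

In a real deployment this function would be wired to the bucket via an S3 event notification, and Lambda would invoke it once per batch of records.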
graph LR
Center["AWS Lambda"]:::main
Rel_api_gateway["api-gateway"]:::related -.-> Center
click Rel_api_gateway "/terms/api-gateway"
Rel_microservices["microservices"]:::related -.-> Center
click Rel_microservices "/terms/microservices"
Rel_serverless["serverless"]:::related -.-> Center
click Rel_serverless "/terms/serverless"
classDef main fill:#7c3aed,stroke:#8b5cf6,stroke-width:2px,color:white,font-weight:bold,rx:5,ry:5;
classDef pre fill:#0f172a,stroke:#3b82f6,color:#94a3b8,rx:5,ry:5;
classDef child fill:#0f172a,stroke:#10b981,color:#94a3b8,rx:5,ry:5;
classDef related fill:#0f172a,stroke:#8b5cf6,stroke-dasharray: 5 5,color:#94a3b8,rx:5,ry:5;
linkStyle default stroke:#4b5563,stroke-width:2px;
🧒 Explain Like I'm 5
Think of AWS Lambda like a magic chef that only cooks when you ask for a specific dish (an event happens). You give the chef the recipe (your code), and they instantly cook it using the right tools, charging you only for the time they spend cooking.
🤓 Expert Deep Dive
AWS Lambda operates on a container-based execution model. When a function is invoked, Lambda provisions a micro-container, loads the function code, and executes it. For infrequently invoked functions, this provisioning step introduces "cold start" latency. Subsequent invocations within a short timeframe reuse the existing container (a "warm start"), significantly reducing latency.

Lambda functions are inherently stateless; any required state must be persisted externally (e.g., in DynamoDB, S3, or RDS). The concurrency model allows multiple instances of a function to run in parallel, managed by AWS to meet demand up to account-level concurrency limits. Provisioned Concurrency can mitigate cold starts for latency-sensitive applications by keeping a specified number of execution environments warm.

Security is managed via IAM roles, granting functions granular permissions to access other AWS resources. The event source mapping mechanism allows Lambda to poll event sources (such as Kinesis streams) and invoke functions with batches of records. Architectural considerations include optimizing function memory allocation (which also affects CPU allocation), managing dependencies, and designing for idempotency due to potential retries.