Hacker News

IMHO leveraging serverless properly generally means writing tiny, stateless, ephemeral, single-responsibility, "pure" functions.

Unfortunately, what it most often seems to be used for is triggering side effects in response to some event, which is exactly the opposite of a pure function. Local testing of that logic probably relies on mocking all the other services it integrates with. Beyond that, the next step tends to be remote integration testing on the actual cloud hosting service, using some sort of separate staging infrastructure that will possibly be shared among many developers.
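To make the mocking point concrete, here is a minimal sketch of unit-testing a side-effecting handler locally. The handler, table, and event shape are all hypothetical, and the database client is injected so it can be replaced with a mock instead of a real boto3 table:

```python
# Hypothetical Lambda-style handler; the table dependency is injected so
# local tests can substitute a mock instead of a real DynamoDB table.
from unittest.mock import MagicMock

def handler(event, context, table):
    """Persists an order event (side effect), returns a receipt."""
    item = {"order_id": event["order_id"], "status": "received"}
    table.put_item(Item=item)  # the side effect we have to mock locally
    return {"statusCode": 200, "body": item["order_id"]}

# Local test: no cloud call happens, we only verify the interaction.
mock_table = MagicMock()
result = handler({"order_id": "42"}, None, table=mock_table)

mock_table.put_item.assert_called_once_with(
    Item={"order_id": "42", "status": "received"}
)
assert result["statusCode"] == 200
```

This only proves the function called its collaborator as expected; whether the real service accepts that call is exactly what still needs the remote integration test.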

Some of the ideas and flexibility offered by cloud hosting are appealing, but in practice, it often feels like we've regressed decades in our software development practices, discarding successful ideas like short feedback loops, unit testing, and one-step build/deploy processes.



> the next step tends to be remote integration testing on the actual cloud hosting service, using some sort of separate staging infrastructure that will possibly be shared among many developers

One benefit of the pay-per-use pricing model of the services used in serverless architectures is that you can easily provision dirt-cheap developer- and feature-branch-specific environments for integration-level testing. While you still can't attach a debugger to your function, the isolation from changes others are working on makes testing and exploration much more pleasant than working in a shared dev environment. Things only get complicated when your architecture includes more expensive components with different pricing schemes that you probably want to share with others (e.g. relational databases, data streams).


> Things only get complicated when your architecture includes more expensive components with different pricing schemes that you probably want to share with others (e.g. relational databases, data streams).

But how often will that not be the case, realistically? By their nature, serverless functions have no persistent storage or other capabilities of their own. All they can do without communicating with other services is return some data to the caller.

So, if you really are writing a lambda to do some calculation on demand, that's OK. However, literally anything else is going to be integrated with other services in one way or another. That will include some sort of database or other persistent storage if there is data to be retrieved or saved for future use. It might involve services that provide other effects, whether sending an email, collecting a payment, or querying someone's geographical database API and sending a message to a user's phone with the nearest dining options.

It used to be a pain testing integrations with third-party online APIs, because most of them don't provide support for spinning up a fresh simulated environment on demand to test against. (Did I mention that as an industry we seem to have regressed by decades in our standard practices?) But with some of these cloud services, we create the same problems even with our own systems, which seems like madness to me.


I've been working on serverless applications for years now, and those cases are quite rare, TBH.

Static costs of data streams come up in data projects, but even there it's trivial to consume data from shared streams in a developer-specific environment, since the stream pointer is managed on the client side. The throughput required by even multiple occasionally consuming dev environments rarely requires provisioning streams above minimum capacity.
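The "pointer is managed on the client side" property is what makes the sharing safe. A toy illustration in plain Python, with a list standing in for the shared stream and each consumer's offset standing in for a Kinesis-style shard iterator (no AWS calls involved):

```python
# One stream shared by all environments; the read position lives with
# each consumer, so dev environments never interfere with each other.
shared_stream = ["event-1", "event-2", "event-3"]

class Consumer:
    """Each dev environment keeps its own pointer into the shared stream."""
    def __init__(self):
        self.position = 0  # analogous to a per-consumer shard iterator

    def poll(self):
        records = shared_stream[self.position:]
        self.position = len(shared_stream)
        return records

dev_a, dev_b = Consumer(), Consumer()
assert dev_a.poll() == ["event-1", "event-2", "event-3"]

shared_stream.append("event-4")
assert dev_a.poll() == ["event-4"]       # dev_a resumes from its own pointer
assert dev_b.poll() == shared_stream[:]  # dev_b independently sees everything
```

With the real service the same idea applies: consumers request their own iterators and checkpoints, so a single provisioned stream serves many isolated readers.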

DynamoDB in on-demand mode is my go-to OLTP database until I know enough about the runtime load characteristics to provision static capacity with autoscaling. For any non-prod (or load-testing) environment, on-demand mode is the best choice. Aurora Serverless is OK as well but has its warts, and so far the cases where I really need a relational database in a greenfield project have been few and far between. Anything with complex query requirements goes to S3 and gets queried asynchronously or in batches via Athena. Not suitable for real-time cases, though.
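Switching a table to on-demand is a single setting at creation time. A sketch of the boto3 table specification, with hypothetical table and key names, and the actual AWS call left commented out since it needs credentials:

```python
# Sketch of a pay-per-request DynamoDB table spec for a per-branch
# dev environment. Names are illustrative; nothing here contacts AWS.
table_spec = dict(
    TableName="orders-dev-feature-x",  # branch-specific table name
    KeySchema=[{"AttributeName": "order_id", "KeyType": "HASH"}],
    AttributeDefinitions=[
        {"AttributeName": "order_id", "AttributeType": "S"}
    ],
    BillingMode="PAY_PER_REQUEST",  # on-demand: no provisioned capacity
)

# Against a real account this would be:
#   import boto3
#   boto3.client("dynamodb").create_table(**table_spec)
```

Because billing is per request, an idle feature-branch table costs essentially nothing beyond stored data, which is what makes per-developer environments cheap.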

Things like queues, notifications, and e-mails are handled by SQS or SNS and paid per usage. WebSockets are handled by API Gateway and Lambda.
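For the WebSocket case, API Gateway dispatches `$connect`, `$disconnect`, and custom routes to a Lambda via the event's `requestContext.routeKey`. A minimal hedged sketch of such a handler, with the side effects only indicated in comments:

```python
# Hypothetical Lambda behind an API Gateway WebSocket API. API Gateway
# invokes it with the route key in event["requestContext"]["routeKey"].
def websocket_handler(event, context=None):
    route = event["requestContext"]["routeKey"]
    if route == "$connect":
        # e.g. store event["requestContext"]["connectionId"] in DynamoDB
        return {"statusCode": 200}
    if route == "$disconnect":
        # e.g. delete the stored connection id
        return {"statusCode": 200}
    # Custom route: echo the message back. Pushing to clients in a real
    # app would use the ApiGatewayManagementApi post_to_connection call.
    return {"statusCode": 200, "body": event.get("body", "")}

assert websocket_handler(
    {"requestContext": {"routeKey": "$connect"}}
)["statusCode"] == 200
```

The handler itself stays a small pure-ish function of the event, which keeps it easy to unit test locally; only the connection bookkeeping touches other services.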

Architecting for serverless requires a shift in mindset and tooling but as a benefit you get cheap isolated development environments and truly elastic production environments with very little ops burden.



