Serverless: Beyond the Backendless Future

Exploring a True Server-Free Architecture

The concept of serverless typically evokes the idea of offloading server management to a third-party provider. Yet, imagine a scenario where serverless is taken literally—no backend servers at all. A traditional web app running entirely without backend infrastructure seems improbable. Nonetheless, let's delve into a ServerFree™ architecture where this notion is a reality.

The Birth of the ServerFree™ Architecture

At its core, the ServerFree™ architecture is a radical deviation from traditional web application structures. It eliminates the need for backend servers, containers, and virtual machines, leaning solely on the client side to handle all operations, including those normally reserved for a server.

The Classic Architecture Approach: A Starting Point

The application's initial design uses a classic architecture built with the subZero libraries, focusing on a robust database schema and a user-friendly layout. The MVP V1 includes custom components for an "Opportunities" page, a rich dashboard experience, and even a detour into Turso DB as a way to package the classic architecture for production deployment.

Steps Toward a Simplified Design

  • Database Schema Generation: Running npx @subzerocloud/scaffold@latest new to scaffold the schema.
  • Server Configuration: Setting up server.ts to route requests.
  • Layout Adjustments: Moving the sidebar to the top of the page for a more space-efficient layout.
  • Enhanced Dashboard: Featuring open opportunities and average application progress.
  • Turso DB Integration: A SQLite-based solution that lets the classic architecture be deployed without data loss (see the sketch after this list).
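
To make the Turso DB step concrete, here is a minimal sketch (not the article's actual code) of how server.ts might query a Turso database, assuming Turso's @libsql/client package; the environment variables and the "opportunities" table are hypothetical placeholders.

```typescript
// Hypothetical server-side data access for the classic architecture,
// using Turso's libSQL client.
import { createClient } from "@libsql/client";

const db = createClient({
  url: process.env.TURSO_DATABASE_URL ?? "libsql://your-db.turso.io", // placeholder
  authToken: process.env.TURSO_AUTH_TOKEN,
});

export async function openOpportunities() {
  // Parameterized query against a hypothetical "opportunities" table.
  const result = await db.execute({
    sql: "SELECT id, company, status FROM opportunities WHERE status = ?",
    args: ["open"],
  });
  return result.rows;
}
```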

The Journey to ServerFree™ Architecture

Here begins the challenging yet innovative transition to a server-free environment. The key idea is SQLite compiled to WebAssembly, so that everything runs locally in the browser.

Constructing a WebAssembly-Based Application

  • SQLite with WebAssembly: Stores data in the Origin Private File System (OPFS), a new browser capability.
  • Web Worker Implementation: Back-end code runs in a web worker so it can use SQLite with OPFS effectively (see the sketch after this list).
  • Service Worker Role: Initially considered for running the entire backend, it instead intercepts UI requests due to OPFS limitations.
  • Main Thread Adaptation: The UI no longer handles authentication, since the system operates on the locally authenticated user's files.
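
As an illustration of the worker-plus-OPFS setup, here is a minimal sketch assuming the official @sqlite.org/sqlite-wasm package; the database file name, table, and message protocol are hypothetical, and the OPFS-backed database typically requires a dedicated worker plus cross-origin isolation headers.

```typescript
// worker.ts (hypothetical, not the article's code): opens an OPFS-backed
// SQLite database inside a dedicated web worker and answers SQL requests
// posted from the main thread or forwarded by the service worker.
import sqlite3InitModule from "@sqlite.org/sqlite-wasm";

const dbPromise = (async () => {
  const sqlite3 = await sqlite3InitModule();
  // OpfsDb persists the file in the origin private file system; it is only
  // usable where the OPFS VFS could be installed.
  const db = new sqlite3.oo1.OpfsDb("/app.sqlite3");
  db.exec(
    "CREATE TABLE IF NOT EXISTS opportunities (id INTEGER PRIMARY KEY, company TEXT, status TEXT)"
  );
  return db;
})();

self.addEventListener("message", async (event: MessageEvent<{ sql: string }>) => {
  const db = await dbPromise;
  const rows: unknown[] = [];
  db.exec({ sql: event.data.sql, rowMode: "object", callback: (row: unknown) => rows.push(row) });
  (self as unknown as DedicatedWorkerGlobalScope).postMessage(rows);
});
```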

The Exhilarating Struggle

The path to the ServerFree™ concept started with a straightforward job application tracker. However, privacy concerns shifted the focus towards a complex exploration of web workers, service workers, and other browser technologies. This unexpected journey resulted in the unintentional birth of a potential new architectural paradigm.

Revolutionary Advantages and Potential Use Cases

The proposed ServerFree™ architecture promises several appealing benefits: no backend servers to run, heightened privacy because data stays on the user's machine, security rooted in authenticating directly against the local computer, and potential efficiency gains from shifting work usually done by a backend onto the client.

Conclusion

This proposal sketches a hypothetical shift in web application development: a paradigm where servers are truly redundant and local browser capabilities do the heavy lifting. The ServerFree™ architecture champions privacy and efficiency, setting the stage for novel applications and a future where the boundary between the web and local computing blurs.


#serverless #webassembly #architecture #applicationdevelopment

https://subzero.cloud/blog/serverfree-architecture/

Exploring the Evolution and Trends of Databases for Serverless and Edge Computing

As developers build applications on serverless and edge computing, they need new tools to support this transformation. This article focuses on databases that support the paradigm shift, and on transactional rather than analytical workloads, given how massive the “backend” space is, spanning search, analytics, data science, and more.

The following are the criteria for this overview:

  • Services which pair exceptionally well with serverless and edge computing
  • Services that support JavaScript and TypeScript codebases

New Programming Models for Modern Applications

Traditional relational databases have been around for years, but serverless-first solutions require a new programming model. This new model should ideally be connectionless, web-native, and lightweight. Developers now prefer thin client libraries and infrastructure that abstracts away complexities like connection pooling and caching.

As a bonus, developers now favor databases or libraries that provide tooling for type-safe access to their data. Examples of such tools are Prisma, Kysely, Drizzle, Contentlayer, and Zapatos.
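
As a small illustration of type-safe data access, here is a hedged sketch using Kysely over a Postgres connection pool; the schema and table are hypothetical and not tied to any particular provider.

```typescript
// Hypothetical schema description: Kysely checks queries against these
// types at compile time.
import { Kysely, PostgresDialect } from "kysely";
import { Pool } from "pg";

interface Database {
  todo: { id: number; title: string; done: boolean };
}

const db = new Kysely<Database>({
  dialect: new PostgresDialect({
    pool: new Pool({ connectionString: process.env.DATABASE_URL }),
  }),
});

export async function openTodos() {
  // A typo in a table or column name here becomes a compile-time error.
  return db
    .selectFrom("todo")
    .select(["id", "title"])
    .where("done", "=", false)
    .execute();
}
```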

Solutions like Neon and Supabase have emerged to abstract connection management for databases like Postgres, providing developers with a simplified means to query and mutate data. In both cases you use a client library that talks to an HTTP API (Supabase) or a special proxy (Neon), as sketched below.
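
A hedged sketch of what that looks like in practice, assuming the public supabase-js and @neondatabase/serverless packages; the project URL, keys, and the "todos" table are placeholders.

```typescript
import { createClient } from "@supabase/supabase-js";
import { neon } from "@neondatabase/serverless";

export async function connectionlessQueries() {
  // Supabase: the client library calls an HTTP API rather than holding a
  // long-lived Postgres connection.
  const supabase = createClient(
    "https://your-project.supabase.co", // placeholder project URL
    process.env.SUPABASE_ANON_KEY!
  );
  const { data, error } = await supabase.from("todos").select("*").eq("done", false);
  if (error) throw error;

  // Neon: the serverless driver sends SQL through Neon's proxy, which suits
  // short-lived serverless and edge functions.
  const sql = neon(process.env.DATABASE_URL!);
  const rows = await sql`SELECT * FROM todos WHERE done = false`;

  return { data, rows };
}
```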

WebSockets may add latency on the initial connection, but they are faster for subsequent requests. Connection management, rather than going away, is now handled by the vendor. Take PlanetScale, for example: it can handle up to a million connections, effectively taking connection management worries off developers’ hands.

Emerging Trends for Database Companies

The evolving programming model has spurred the following key trends in the database industry:

  • Data Platforms – Databases are increasingly transitioning into data platforms to accommodate adjacent solutions like full-text search and analytics.
  • Decoupling of Storage and Compute – Inspired by companies like Snowflake, an increasing number of players in the industry, such as Neon, are decreasing the cost of a “database at rest” by decoupling storage and compute.
  • Infinite Scaling Solutions – Solutions like DynamoDB have made it possible to scale infinitely without the need to adjust memory, storage, CPU, clusters, and instances.
  • Global Data – The availability of specialized data APIs and user-specific data stores has made global data a reality.
  • Serverless Solutions – More databases are embracing serverless; however, what “serverless” means varies somewhat from company to company.

To help you better understand your options, I have categorized the solutions based on whether they are “established” or “rising”, whether they are serverless/serverful, as well as their level of maturity (i.e., whether they are generally available (GA) or pre-GA). Below are some examples:

Established

Firestore – a well-adopted document database with built-in support for authentication, real-time workloads, and cross-platform support for mobile.
MongoDB Atlas Serverless – has an entire data platform, including search, analytics, and more.

Rising

Convex – very useful for real-time workloads, but also has a simple, type-safe interface for querying/mutating data.
Grafbase – If you love GraphQL, Grafbase is worth exploring.
Neon – Provides Postgres with separation of storage and compute.

Other Solutions

  • Caching Engines: Stellate, Prisma Accelerate, ReadySet.
  • Cloud Provider Offerings: AWS DynamoDB, Azure SQL, Azure CosmosDB, Google Cloud SQL, Google Bigtable, and more.
  • Content Management (Headless CMS): These can still act as a database, just a different domain-specific solution. Sanity, Contentful, Sitecore, and more.

Feedback is very much welcome. Who have I missed? Of these services, which ones have you tried and liked?

Special Thanks

A special thanks to Guillermo Rauch, Paul Copplestone, Fredrik Björk, Anthony Shew, Craig Kerstiens, Jamie Turner, Nikita Shamgunov, Yoko Li, Pratyush Choudhury, Stas Kelvich, Enes Akar, and Steven Tey for reviewing this post.


Tags: #Databases, #Serverless, #EdgeCompute, #ProgrammingModels

Reference Link

Serverless Compute in 2023: Top Trends, Challenges & Adoption Patterns in AWS, Google Cloud and Azure

In the ever-evolving landscape of computing, serverless has undeniably established itself as a central pillar. The driving force behind this transition is the growing availability of serverless offerings from major cloud providers such as Amazon Web Services (AWS), Google Cloud, and Azure, along with emerging platforms like Vercel and Cloudflare.

This report provides a comprehensive analysis of how over 20,000 organizations are utilizing serverless technologies in their operations, exploring significant trends and insights drawn from real-world applications of this transformative technology.

Shift Toward Serverless Adoption

Significant growth has been observed in serverless adoption among organizations operating on Azure and Google Cloud, with AWS also showing positive development. For instance, 70% of AWS customers and 60% of Google Cloud customers now use serverless solutions, and Azure isn’t far behind, with 49% of its customers embracing serverless offerings.

This upswing can be attributed to the expanding suite of serverless tools, ranging from FaaS solutions to serverless edge computing, offered by these cloud providers to meet their customers’ unique needs.

The Rise of Container-Based Serverless Computing

Google Cloud, since launching Cloud Run in 2019, has led in fully managed container-based serverless adoption. This year, however, the share of AWS serverless organizations running containerized Lambda functions or AWS App Runner rose to 26%. Azure also experienced considerable year-over-year growth, propelled by the launch of Azure Container Apps.

Container-based serverless compute platforms are gaining traction as they facilitate serverless adoption and migration by enabling organizations to deploy existing container images as microservices. Apart from that, these platforms offer wider language support and larger application sizes.

Serverless Platforms: Beyond The Major Providers

While the major providers dominate the serverless space, frontend development and Content Delivery Network (CDN) platforms like Vercel, Netlify, Cloudflare, and Fastly also equip developers with specialized serverless compute capabilities. Interestingly, 7% of organizations monitoring serverless workloads in a major cloud are also running workloads on one or more of these emerging platforms.

Choice of Languages for AWS Lambda

Node.js and Python are the languages of choice for most AWS Lambda developers, accounting for over half of all invocations. The rising popularity of custom runtimes indicates a growing interest in serverless containers, which allow developers to work with languages not natively supported by Lambda.

The Challenge of Cold Starts

Cold starts, where a new execution environment is created to serve a request, remain a significant concern. This is especially true for Java-based Lambda functions, which exhibit the longest cold start times because of the time needed to load the JVM and Java libraries.

The Adoption of AWS Lambda on ARM

The usage of AWS Lambda on ARM has doubled in the past year, primarily due to its combined benefits of faster execution times and lower costs.

Deployment Tools for AWS Lambda

Infrastructure as Code (IaC) tools like the Serverless Framework and Terraform greatly simplify the deployment and configuration of Lambda functions and other resources. As organizations mature and scale, their preference for IaC tools shifts: larger organizations lean towards Terraform for its multi-cloud support and flexibility.
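
The report's figures concern the Serverless Framework and Terraform; as a TypeScript-flavored illustration of the same infrastructure-as-code idea, here is a minimal AWS CDK sketch (a different IaC tool, with a placeholder function name and asset path).

```typescript
// Hypothetical CDK stack: declares a Lambda function as code, analogous to
// a Serverless Framework or Terraform definition.
import { App, Stack, StackProps } from "aws-cdk-lib";
import * as lambda from "aws-cdk-lib/aws-lambda";
import { Construct } from "constructs";

class ApiStack extends Stack {
  constructor(scope: Construct, id: string, props?: StackProps) {
    super(scope, id, props);

    new lambda.Function(this, "HelloFn", {
      runtime: lambda.Runtime.NODEJS_18_X,
      architecture: lambda.Architecture.ARM_64, // Lambda on ARM, as noted above
      handler: "index.handler",
      code: lambda.Code.fromAsset("dist/hello"), // placeholder build output
    });
  }
}

const app = new App();
new ApiStack(app, "ApiStack");
```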

Connection of AWS Lambdas to a Virtual Private Cloud (VPC)

The complexity of integrating serverless functions with existing infrastructure has led many organizations to connect their Lambda functions directly to their VPCs. According to recent statistics, 65% of Datadog customers have at least five Lambda functions connected to a dedicated VPC in their AWS account.

Serverless technologies today are making developers’ lives easier by being more secure, cost-effective, flexible, and efficient. The prominence of serverless in modern application building is only expected to surge further in the coming years.

Tags: #Serverless #AWSLambda #GoogleCloud #Azure #Terraform #Containerization #VPC #Nodejs #Python #ARM

Reference Link

Exploring the Benefits and Use Cases of Serverless Architecture in Cloud Development

When it comes to modern software development in the cloud, serverless applications hold undeniable advantages over traditional applications. The serverless approach allows developers to focus more on the unique features of their applications and less on common maintenance tasks such as OS updates and infrastructure scaling.

The Serverless Landscape

The serverless landscape is largely dominated by Function as a Service (FaaS) providers, with the three largest ones being Amazon Web Services (AWS), Microsoft Azure, and Google Cloud Platform. These providers take care of all the infrastructure-related work, thus eliminating infrastructure as a potential point of failure and efficiency bottleneck.

When to Consider a Serverless Approach?

Serverless architecture is not always the ideal choice for every software development project. However, it may be worth considering if your circumstances fall under these categories:

  • The development of small to mid-sized applications
  • Loads are unpredictable
  • The application is amenable to quick (fail-fast) experimenting
  • The team has the requisite skills to leverage serverless advantages

When Might Serverless Not Be the Right Fit?

Conversely, serverless architecture may not be optimal for your project if:

  • Workloads will be constant
  • You anticipate long-running functions
  • You plan to use programming languages not supported by serverless platforms

Common Serverless Use Cases

Serverless architecture often finds use in:

  • Big data applications
  • Web applications
  • Backend services
  • Data processing
  • Chatbots and virtual assistants like Amazon Alexa and Google Assistant
  • IT automation

Monitoring Tools for Serverless Architecture

While serverless makes infrastructure management a breeze, you still need to monitor your system effectively. Thankfully, numerous tools have been developed specifically for serverless monitoring to help you keep track of your serverless systems.

The Verdict on Serverless Architecture

Migrating legacy apps to a serverless architecture or adopting serverless computing for new projects should only be undertaken after careful deliberation, taking into account the specifics of the project and its alignment with the benefits serverless architecture offers.

Stay tuned, as we dig deeper into AWS Lambda serverless architecture in the upcoming article.

Tags: #Serverless #CloudDevelopment #FaaS #AWS #Azure #GoogleCloud

Reference Link

Shifting Paradigms: Transition from Microservices to Serverless Computing in Software Development

In the ever-evolving landscape of software development, we often face various challenges when dealing with traditional 3-tier architecture applications. Issues range from setting up servers, installing operating systems and necessary software, and managing those servers to designing applications for high availability and fault tolerance and handling load balancing, each of which may result in additional expenditure on infrastructure resources.

Understanding the Journey

Monolithic applications, despite their comprehensive nature, have certain drawbacks: their components are highly interdependent, they are tied to a single language/framework, and they pose enhancement and scalability difficulties. To counter these shortcomings, there has been a shift towards microservices and serverless architectures.

What Are Microservices?

Coined by James Lewis and Martin Fowler, the microservice architectural style refers to developing a single application as a suite of small services, each running in its own process and communicating through lightweight mechanisms such as an HTTP resource API. Minimal centralized management allows these services to be written in different programming languages and to leverage different data storage technologies.

The Leap Towards Serverless Architecture

Ironically, the term ‘serverless architecture’ doesn’t imply the absence of a server. In fact, your application continues to operate on a server, but the distinction lies in server management and creation – you aren’t responsible for it. The serverless providers take care of everything while you focus solely on the code.

Although a serverless application shares characteristics with a microservice, the two aren’t identical. A microservice is larger and can encompass one or more functions, whereas a serverless application depends on an event-driven function consisting of a small, specific code fragment.

Breaking Down Serverless Computing

Serverless computing has become the trendiest architecture in the software industry today. It liberates developers from the responsibility of managing servers and backend infrastructure. By adopting a serverless compute service, developers can also build loosely coupled, reliable, and scalable applications with a faster time to market.

Essential Serverless Design Principles

In order to leverage serverless computing effectively, developers must adhere to its fundamental design principles:

  • On-demand execution: Serverless functions execute code only when necessary.

  • Stateless single-purpose functions: These facilitate improved debugging and testing, as they are small, separate units of logic (see the sketch after this list).

  • Push-based, event-driven pipeline: This implies that each function performs a specific task driven by events.

  • Heavy and powerful front-end: Any static front-end can interact directly with cloud services.

  • Use of third-party services: Helps sustain scalable applications that require high-bandwidth pipelines or use complex logic.
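
As a sketch of a stateless, single-purpose function in a push-based, event-driven pipeline (hypothetical bucket, queue, and payload; assumes the aws-lambda type package and AWS SDK v3 for JavaScript):

```typescript
// Hypothetical pipeline step: an S3 upload event triggers this function,
// which does one job (extract object metadata) and pushes the result to the
// next stage via SQS. No state is kept between invocations.
import type { S3Event } from "aws-lambda";
import { SQSClient, SendMessageCommand } from "@aws-sdk/client-sqs";

const sqs = new SQSClient({});
const NEXT_STAGE_QUEUE_URL = process.env.NEXT_STAGE_QUEUE_URL!; // placeholder

export const handler = async (event: S3Event): Promise<void> => {
  for (const record of event.Records) {
    const message = {
      bucket: record.s3.bucket.name,
      key: record.s3.object.key,
      size: record.s3.object.size,
    };
    await sqs.send(
      new SendMessageCommand({
        QueueUrl: NEXT_STAGE_QUEUE_URL,
        MessageBody: JSON.stringify(message),
      })
    );
  }
};
```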

Conclusion

Serverless architecture is a crucial paradigm: it scales more efficiently, is highly available and easy to deploy, and reduces latency and cost. Moreover, developers have more time for core development thanks to reduced infrastructure maintenance responsibilities.

Nonetheless, this paradigm shift isn’t without challenges. From a business standpoint, since serverless architecture is managed by external providers, there is less control over the server side, which increases risk, and adopting a serverless provider means addressing vendor lock-in. From a developer’s standpoint, implementing and handling functions for large applications can be time-consuming, and managing numerous functions is challenging, risking the creation of mini-monoliths. On top of this, reliance on third-party providers for monitoring and debugging tools is unavoidable, which often leads to a shortage of operational tooling.

That said, the acceptance and success of serverless architecture are hugely dependent on the business requirements rather than simply on the technology. When used appropriately, serverless can indeed do wonders.

Tags: #Serverless #Microservices #SoftwareArchitecture #AppDevelopment

Reference Link

Exploring Innovation and Overcoming Challenges in Serverless Architecture

Serverless architecture, a cloud computing execution model, has been widely adopted for its efficiency and cost-effectiveness. It allows developers to build and run applications and services without worrying about server infrastructure management.

The Evolution

The serverless computing model feels like using a software-as-a-service application within your own application architecture. It is designed to take the distractions away and help developers focus on coding. One of its most admired features is functions-as-a-service (FaaS), or cloud functions, where you can run your server-side software without worrying about server infrastructure such as Kubernetes clusters or virtual machines. AWS Lambda, the best-known FaaS, is quite popular among serverless users, yet there is more to serverless than AWS Lambda alone.

Flavors of Serverless

A recent trend in serverless is increasingly specialized application hosting. Services like Vercel and Netlify, which host your websites or Next.js applications, manage the applications for you and are still considered serverless.

Some users refrain from writing much custom server-side code and instead rely on independent third-party services, for example using a third-party authentication system rather than their own databases and libraries. Joe Emison, who is exploring this concept in his upcoming book, is one such enthusiast: he integrates numerous third-party services through a front-end application to demonstrate a modern way of using serverless.

Enterprise-ification of Serverless

While serverless is hot in the market, its definition is going through a wave of changes. One emerging trend is the “enterprise-ification” of serverless; the array of serverless-ish versions of services that Amazon has been launching is an example. These services, even though automated, don’t have a zero-cost floor, which raises questions about their “serverless” tag.

Enhanced Features

To cater to increasing demands, Lambda has added several features, among them SnapStart (to reduce cold start times) and an option to lock down the runtime version. These additions are meant to entice developers to run more workloads on Lambda.

Serverless Development and Challenges

Serverless development has its own unique challenges. A common misconception is that serverless requires major changes to CI/CD; in practice, it is the application architecture that needs major revision because always-on servers go away.

Choosing the granularity of the service also generates debate among serverless developers. Should an application with 20 tasks have 20 Lambda functions or one function managing all 20 tasks? It seems like a simple question, but the answer varies significantly (see the sketch below).
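
A hedged sketch of the two ends of that spectrum, using hypothetical task names and the aws-lambda type package: one function that routes many tasks internally versus one function per task.

```typescript
// Option A (hypothetical): a single Lambda function routing all tasks internally.
import type { APIGatewayProxyEventV2, APIGatewayProxyResultV2 } from "aws-lambda";

const tasks: Record<string, () => Promise<unknown>> = {
  "/invoices/generate": async () => ({ generated: true }),
  "/emails/send": async () => ({ queued: true }),
  // ...the remaining 18 tasks would live in this one deployment unit.
};

export const monoHandler = async (
  event: APIGatewayProxyEventV2
): Promise<APIGatewayProxyResultV2> => {
  const task = tasks[event.rawPath];
  if (!task) return { statusCode: 404, body: "unknown task" };
  return { statusCode: 200, body: JSON.stringify(await task()) };
};

// Option B (hypothetical): one Lambda function per task, each deployed,
// scaled, and monitored separately.
export const generateInvoices = async (): Promise<APIGatewayProxyResultV2> => {
  return { statusCode: 200, body: JSON.stringify({ generated: true }) };
};
```

Option A keeps deployment simple but risks re-creating a mini-monolith; Option B isolates scaling and failures per task at the cost of more functions to manage.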

Furthermore, serverless may seem more expensive because its pricing is so visible, but the overall cost, once the reduced labor of managing Kubernetes environments is factored in, may well be lower.

Wrapping Up

Serverless brings a dramatic shift in how applications are built and managed. The associated complexity and costs are less tangible and less upfront. However, the agility, scalability, and cost-effectiveness make it a worthy consideration. Architecting applications for serverless may take time initially, but the long-term benefits are immense.

Tags: #ServerlessArchitecture, #AWSServerless, #LambdaFunctions, #EnterpriseServerless

Reference Link