What is Fair Processing? 🤔

Emil Rasmussen
CTO at Enterspeed

Enterspeed processes a lot of data in our Middle Layer. To ensure that every tenant and client gets equal processing time and resources, we have implemented what we call Fair Processing. This means that no single customer can hog resources and cause delays for others.

Most often, processing is very straightforward: One source entity is ingested, and a transformed view is ready for the Delivery API in a few seconds. No muss, no fuss.  

But sometimes, the processing becomes complicated.  

For instance, a schema deployment or a re-ingest of all source entities can generate thousands of processing jobs, and the sheer volume of data makes instant processing near impossible.

Noisy Neighbours  

In our Middle Layer, data processing is built as a FIFO (First In, First Out) queue. Because the layer runs multiple projects for multiple customers, a backlogged queue for one customer can affect processing times for everyone else, especially when thousands of processing jobs arrive at once.

“The noisy neighbour problem” is a common issue that arises when one customer can affect others in a shared environment with shared resources.  

By using our fair processing queue, we can regulate how much processing time is available for different classes of customers and different types of processing jobs. This ensures that every tenant receives an equal share of the available processing time and resources, making the processing queue fair for everyone.  

In other words, fair processing removes the noisy neighbour problem, so that for every customer, ingesting a source entity results in a generated view a few seconds later.
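The round-robin idea behind a fair queue can be sketched in a few lines of Python. This is purely an illustration of the technique, not Enterspeed's actual implementation: each tenant gets its own FIFO queue, and the dispatcher cycles between tenants so a tenant with thousands of pending jobs cannot starve one with a single job.

```python
from collections import defaultdict, deque

class FairQueue:
    """Round-robin dispatch across per-tenant FIFO queues,
    so no single tenant can monopolise processing.
    (Hypothetical sketch, not Enterspeed's real queue.)"""

    def __init__(self):
        self.queues = defaultdict(deque)  # tenant_id -> FIFO of that tenant's jobs
        self.order = deque()              # tenants that currently have pending jobs

    def enqueue(self, tenant_id, job):
        if not self.queues[tenant_id]:
            self.order.append(tenant_id)  # tenant joins the rotation
        self.queues[tenant_id].append(job)

    def dequeue(self):
        if not self.order:
            return None
        tenant = self.order.popleft()
        job = self.queues[tenant].popleft()
        if self.queues[tenant]:
            self.order.append(tenant)  # tenant goes to the back of the rotation
        return job
```

With this scheme, if tenant A enqueues three jobs and tenant B enqueues one, the dispatch order interleaves them (A, B, A, A) instead of processing all of A's backlog first, which is exactly the fairness property described above.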

Photo of man wearing a cap saying Love your neighbour

No more over-processing  

Another issue Enterspeed addresses in our fair processing queue is "over-processing".  

Over-processing happens when updating one source entity triggers a spike in processing jobs, re-processing far more source entities than necessary. By employing a job de-duplication feature, we detect duplicate jobs within the same batch of processing jobs and reduce the overall number of jobs.

This saves both time and cost and is more sustainable. 
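Batch de-duplication of this kind can be sketched as follows. The field names (`source_entity_id`, `job_type`) are illustrative assumptions, not Enterspeed's actual job schema: the idea is simply that two jobs which would re-process the same entity in the same way are collapsed into one.

```python
def deduplicate(jobs):
    """Collapse duplicate jobs within one batch, keeping the first
    occurrence of each (source_entity_id, job_type) pair.
    Field names are illustrative, not Enterspeed's real schema."""
    seen = set()
    unique = []
    for job in jobs:
        key = (job["source_entity_id"], job["job_type"])
        if key not in seen:
            seen.add(key)
            unique.append(job)
    return unique
```

A batch containing two identical "re-process entity 1" jobs and one "re-process entity 2" job would shrink from three jobs to two, which is where the time and cost savings come from.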

Intelligent cache 

So, the fair processing layer ensures that processing times are efficient and sustainable for all customers. It also eliminates the noisy neighbour problem, ensuring that each customer gets their fair share of processing resources without negatively impacting others.  

Additionally, the introduction of job de-duplication has reduced the number of processing jobs, saving both time and cost and lowering environmental impact.

These improvements are part of Enterspeed's efforts to provide next-generation caching that optimises performance, reduces costs, and minimises the environmental impact of processing.

Want to know more about how we process data in Enterspeed? Check out our key features Ingest API, Schema designer, and Delivery API.   

Photo by Prateek Srivastava on Unsplash
Emil Rasmussen, CTO at Enterspeed

20 years of experience with web technology and software engineering. Loves candy, cake, and coaching soccer.
