Optimizing Supplier Catalogs in eProcurement Systems: The Quest for the Perfect Implementation

In the previous article, we discussed the importance of integrating product catalogs into eProcurement systems. Today, my focus will be on the technical details.

It is easy to look back and say that someone did something wrong and that it could have been done better and cheaper. But everyone knows how difficult it is to make decisions in the present. I can't blame anybody for decisions made in the past, because I don't know all the reasons, requirements, and possibilities that existed at that moment. My hope is that the people who work with my code will be just as tolerant of my decisions 😀

That is one side of it; the other is that it is still difficult to design a silver bullet. The main reason is that we are building for the future. Nobody can say for sure what will be better or how it should work, so we have to search step by step, forming ideas and moving forward. Each idea generates code that we have to support going forward, and that code can be very costly to change or remove.

So the question of how to improve an existing system is highly individual and requires deep investigation. Still, figuring out the best approach and defining a roadmap for getting there is a very real task.

We spent a lot of time and resources on this journey, from synchronously importing small feeds that were simple and cheap, to understanding where we need to be to meet new market requirements. And it's certainly not the end, but we are ready to face new challenges!

Challenges Faced in the Past

Scalability Limitations

The system imported data in multiple threads, but we couldn't scale the cluster horizontally, only vertically, and we couldn't guarantee strict message ordering.

Lack of Flexibility

We couldn't adapt the business model quickly enough to follow market needs, because redesigning the system and migrating data was difficult and expensive. A data journal would have helped by making it possible to add new stateful consumers.

Complexity of Data Formats and Versions

The system supported different data formats and different versions for backward compatibility, and code complexity increased with each new feature.

Limited Process Recovery

The system processed the entire feed at once. If it failed for any reason, we couldn't recover, because we didn't know which rows had already been processed, and the system couldn't simply re-run the feed because that risked endless reprocessing. In such cases, users had to restart the process themselves.

Introducing a New Approach: Asynchronous and Stateful Processing

To overcome these challenges, we have adopted an approach built on asynchronous and stateful processing.


Our new model significantly enhances the stability, performance, and adaptability of eProcurement systems. It tackles fundamental challenges in system design and offers the following advantages:

Asynchronous Processing
Our system now runs processes on request and returns an identifier to the requester, allowing them to check the status with subsequent requests. This eliminates timeouts and ensures smooth operation.
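The request-then-poll pattern above can be sketched as follows. This is a minimal illustration, not the production API; the class and method names (`ImportJobRunner`, `submit`, `status`) are invented for the example:

```python
import threading
import time
import uuid

class ImportJobRunner:
    """Starts an import in the background and lets the caller poll its status."""

    def __init__(self):
        self._status = {}
        self._lock = threading.Lock()

    def submit(self, feed_rows):
        # Return an identifier immediately; the work continues in the background.
        job_id = str(uuid.uuid4())
        with self._lock:
            self._status[job_id] = "RUNNING"
        threading.Thread(target=self._run, args=(job_id, feed_rows)).start()
        return job_id

    def _run(self, job_id, feed_rows):
        for row in feed_rows:
            pass  # process each row; a failure would set the status to "FAILED"
        with self._lock:
            self._status[job_id] = "DONE"

    def status(self, job_id):
        # Subsequent requests check progress by identifier instead of waiting.
        with self._lock:
            return self._status.get(job_id, "UNKNOWN")
```

Because the caller gets the identifier back right away, no HTTP request has to stay open for the duration of the import, which is what eliminates the timeouts.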

Efficient Data Handling
Our system reads the data feed line by line or object by object, preserving all information without loss and ensuring data integrity.
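As a sketch of this streaming style (assuming a CSV feed; the column names are invented), a generator yields one parsed row at a time, so memory use stays flat regardless of feed size:

```python
import csv
import io

def iter_feed_rows(fileobj):
    """Yield (line_number, row) pairs one at a time instead of loading the whole feed."""
    reader = csv.DictReader(fileobj)
    for line_number, row in enumerate(reader, start=1):
        yield line_number, row

# Example feed; in production this would be a file or network stream.
feed = io.StringIO("sku,price\nA-1,10.00\nA-2,12.50\n")
rows = list(iter_feed_rows(feed))
```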

Atomic Line Processing
Each feed line or object is processed as an atomic operation, improving load balancing and data integrity.

Stateful Line Tracking
The system maintains processing states for each line, allowing us to determine when the last change occurred, improving data management capabilities.
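A minimal sketch of such per-line state (the record shape and status names are illustrative, not our actual schema): each line carries its own status and a timestamp of the last transition, so we can always tell when a line last changed and which lines are still outstanding.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class LineState:
    line_number: int
    status: str = "PENDING"  # PENDING -> PROCESSED or FAILED
    last_changed: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))

class LineTracker:
    def __init__(self):
        self._states = {}

    def register(self, line_number):
        self._states[line_number] = LineState(line_number)

    def mark(self, line_number, status):
        # Every transition updates the timestamp, answering "when did this line last change?"
        state = self._states[line_number]
        state.status = status
        state.last_changed = datetime.now(timezone.utc)

    def pending(self):
        return [s.line_number for s in self._states.values()
                if s.status == "PENDING"]
```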

Data Correctness Validation
To maintain data accuracy, our system validates data integrity and reports when improvements are needed.

Resource Optimization
Through intelligent checks for data changes, we save system resources when processing unchanged data, ensuring efficient resource utilization.
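One common way to implement such a check, shown here as a sketch (the in-memory dict stands in for whatever store holds the previous import's hashes), is to hash each line's content and reprocess only the lines whose hash differs from last time:

```python
import hashlib

def changed_lines(lines, previous_hashes):
    """Return the keys of changed lines plus the new hash map for the next run."""
    to_process, new_hashes = [], {}
    for key, content in lines:
        digest = hashlib.sha256(content.encode("utf-8")).hexdigest()
        new_hashes[key] = digest
        if previous_hashes.get(key) != digest:
            to_process.append(key)  # new or modified since the last import
    return to_process, new_hashes
```

On a re-import of a mostly unchanged feed, almost every line is skipped, which is where the resource savings come from.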

Guaranteed Delivery and Deduplication
Our system guarantees data delivery while avoiding duplicate row processing, further improving data accuracy.
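The usual trade-off behind this is at-least-once delivery paired with a deduplicating consumer; here is a sketch of the consumer side (the class is illustrative), which skips rows it has already applied:

```python
class DedupConsumer:
    """Makes redelivered messages harmless by remembering which IDs were applied."""

    def __init__(self):
        self._seen = set()
        self.applied = []

    def handle(self, message_id, payload):
        if message_id in self._seen:
            return False  # duplicate delivery; already applied, skip it
        self._seen.add(message_id)
        self.applied.append(payload)
        return True
```

The publisher can then safely retry on any uncertainty, since a repeated message changes nothing on the consumer.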

Error Handling and Conflict Resolution
In cases of internal problems or data conflicts, our system supports the ability to reprocess individual lines, ensuring data consistency.
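A sketch of what per-line reprocessing looks like in practice (the helper is invented for illustration): failures are recorded individually, so a later retry targets only the failed lines instead of re-running the whole feed.

```python
def process_feed(lines, handler):
    """Apply handler to each (line_number, line); return the lines that failed."""
    failed = []
    for line_number, line in lines:
        try:
            handler(line)
        except Exception:
            failed.append((line_number, line))  # recorded for a targeted retry
    return failed
```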

Smart Processing Delays
If a product associated with a line does not yet exist, our system can delay processing until the product is available, streamlining the entire process.
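This "park and retry" behavior can be sketched as follows (class and method names are illustrative): a line referencing a missing product is parked rather than failed, and is replayed when the product-created event arrives.

```python
class DeferredLineProcessor:
    """Parks lines whose product doesn't exist yet and replays them on creation."""

    def __init__(self, known_products):
        self.known_products = set(known_products)
        self.parked = {}      # product_id -> lines waiting for that product
        self.processed = []

    def process(self, product_id, line):
        if product_id not in self.known_products:
            self.parked.setdefault(product_id, []).append(line)
            return "PARKED"
        self.processed.append(line)
        return "PROCESSED"

    def on_product_created(self, product_id):
        # The product now exists, so replay everything that was waiting for it.
        self.known_products.add(product_id)
        for line in self.parked.pop(product_id, []):
            self.processed.append(line)
```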

Event-Driven Microservices
Our system generates new events upon data storage, enabling seamless data processing across other microservices within the architecture.

Simplified Data Format
By supporting only one internal data format, we reduce complexity and minimize the overhead associated with data transformation.

Adaptability and Scalability
Our model is designed for easy adaptation to new requirements without significant performance impacts, ensuring future scalability.

Considerations and Future Enhancements

While our new approach offers numerous advantages, we acknowledge that there are still certain considerations and possible improvements:

Data Structure and Storage Migration

Changes to the data structure or storage may necessitate careful migration planning to prevent disruptions.

Line Lifecycle Management

To avoid endless processing, close attention must be paid to managing the lifecycle of each line in the system.

Processing Order

While our system ensures efficient processing, the exact order of line processing may not be guaranteed in all cases.

Cost Implications

The advanced architecture, while highly beneficial, costs more than a simpler alternative, so a cost-benefit analysis is vital.

The Road to Enhanced Performance and Capabilities

The remaining limitations relate to specific business and performance requirements that can be addressed with a distributed event store and stream-processing platform. It is a more expensive option, but it would allow us to solve the following challenges if needed:

  1. Publish/Subscribe with Strict Message Order: Implementing a distributed event store enables publish/subscribe patterns with strict message ordering, even across multi-cluster systems.
  2. Flexible Consumer Read Options: With a stream-processing platform, new or existing consumers can start reading messages from any point in the queue, facilitating real-time processing.
  3. Fault-Tolerant Data Storage: By adopting a distributed event store, data can be stored in a fault-tolerant and durable manner, ensuring data resilience.
  4. Low Latency Message Delivery: A stream-processing platform enables low-latency message delivery, reducing time between publishing and receiving messages.
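To illustrate the first point, here is a toy sketch of how a partitioned log (in the spirit of Kafka-like event stores) preserves strict per-key ordering: every message with the same key hashes to the same partition and is appended in arrival order. The classes here are purely illustrative, not any real platform's API:

```python
import zlib

def partition_for(key, num_partitions):
    # Stable hash of the key decides the partition, so one key -> one partition.
    return zlib.crc32(key.encode("utf-8")) % num_partitions

class PartitionedLog:
    """Append-only log split into partitions; order is strict within a partition."""

    def __init__(self, num_partitions):
        self.partitions = [[] for _ in range(num_partitions)]

    def publish(self, key, message):
        self.partitions[partition_for(key, len(self.partitions))].append(
            (key, message))

    def read(self, partition):
        return list(self.partitions[partition])
```

Since all updates for one catalog item share a key, a consumer reading its partition sees those updates in exactly the order they were published.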

The Future Awaits

While a distributed event store and stream-processing platform is a more advanced solution, it opens the door to new possibilities for businesses with unique requirements. As we continue to evolve and innovate, we look forward to your reactions and comments.

Thank you for your interest and engagement!

Take care!


Tradeshift Integrator Team
