Laslo Pastor

Integration thought iteration - The not Bad, the OK, and the Good Enough

In response to a recent request to enhance the pickup system for a transport and storage company, I embarked on a series of iterations to develop viable solutions. This article delves into the evolution of ideas and presents three potential solutions that were considered before identifying the one deemed "good enough" for implementation.

Problem

A storage company operates multiple locations where products are stored. When they receive shipping requirements, they must gather pickup orders from various locations to fulfill the required quantities of each product. The individual responsible for creating the pickup list must determine the closest storage location and verify if there is a sufficient quantity of the required product available. If the required quantity is not available, they need to check the next storage location until the quantity requirement is met. This process must be repeated for every product on the list.


Solution 'not Bad' - The first iteration


Exposing database data as a REST Data Service offers significant benefits, providing a straightforward way to access and manipulate data. Additionally, calling SQL procedures as a REST resource facilitates seamless integration into applications, simplifying the development process and enhancing overall efficiency.
However, ensuring controlled and secure access for external users requires an API management gateway.

The primary enhancement was an SQL procedure that uses a ZIP code search to locate the nearest storage facilities and populates the pickup list until the required quantity of the requested product has been gathered.

CREATE DEFINER=`db_753`@`73.232.207.137` PROCEDURE `cloud_demo`.`orderList`(IN productID INT, IN zip CHAR(5), IN qty DECIMAL(5,2))
BEGIN
DECLARE done BOOLEAN DEFAULT FALSE;
DECLARE c_zip VARCHAR(5);
DECLARE c_address VARCHAR(255);
DECLARE c_product_name VARCHAR(255);
DECLARE c_quantity DECIMAL(5, 2);
DECLARE totalQuantity DECIMAL(10, 2) DEFAULT 0.00;

DECLARE orderCursor CURSOR FOR
     select s.Qty, s2.zip, s2.Address,p.Name from 
     Stock s inner join Product p on s.product_id = p.id inner JOIN Storage s2 on s.storage_id = s2.id 
     where p.id = productID and s2.zip like zip
     ORDER by s.Qty DESC, s2.zip;

-- Declare handler for cursor
DECLARE CONTINUE HANDLER FOR NOT FOUND SET done = TRUE;

-- Temporary table to store product quantities
CREATE TEMPORARY TABLE IF NOT EXISTS tempProductQuantityPerZip ( 
    t_prod_quantity DECIMAL(5,2),
    t_zip VARCHAR(5),
    t_address VARCHAR(255),
    t_product_name VARCHAR(255)
);  

Delete FROM tempProductQuantityPerZip;
set totalQuantity = 0;

-- Open cursor
OPEN orderCursor; 

-- Start iterating over the returned list
read_loop: LOOP
-- Fetch data from cursor into variables
FETCH orderCursor INTO c_quantity,c_zip,c_address,c_product_name;

    -- Check if cursor has reached the end of the result set
    IF done THEN
        LEAVE read_loop;
    END IF;

    IF totalQuantity < qty then
     INSERT into tempProductQuantityPerZip(t_prod_quantity,t_zip,t_address,t_product_name) VALUES (c_quantity,c_zip,c_address,c_product_name);
    END IF;

   SET totalQuantity = totalQuantity + c_quantity;

 END LOOP read_loop;

-- Close cursor
CLOSE orderCursor;

Select * from tempProductQuantityPerZip;

END

To create the RDBMS data source, I used WSO2 MI.
The full DB script can be found in the GitRepository.
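
To give an idea of how the procedure ends up exposed as a REST resource, here is a minimal data-service (.dbs) sketch. It assumes MySQL and the procedure above; the service name, connection details, resource path, and element names are placeholders I made up for illustration, not the ones from the actual project.

<data name="PickupDataService" transports="http https">
    <config id="storageDb">
        <property name="driverClassName">com.mysql.cj.jdbc.Driver</property>
        <property name="url">jdbc:mysql://localhost:3306/cloud_demo</property>
        <property name="username">db_user</property>
        <property name="password">db_password</property>
    </config>
    <!-- Wrap the stored procedure in a query -->
    <query id="orderListQuery" useConfig="storageDb">
        <sql>CALL orderList(?,?,?)</sql>
        <param name="productID" sqlType="INTEGER"/>
        <param name="zip" sqlType="STRING"/>
        <param name="qty" sqlType="DOUBLE"/>
        <result element="PickupLocations" rowName="PickupLocation">
            <element column="t_zip" name="zip" xsdType="string"/>
            <element column="t_address" name="address" xsdType="string"/>
            <element column="t_product_name" name="productName" xsdType="string"/>
            <element column="t_prod_quantity" name="quantity" xsdType="decimal"/>
        </result>
    </query>
    <!-- Expose the query as a REST resource -->
    <resource method="GET" path="orderlist">
        <call-query href="orderListQuery">
            <with-param name="productID" query-param="productID"/>
            <with-param name="zip" query-param="zip"/>
            <with-param name="qty" query-param="qty"/>
        </call-query>
    </resource>
</data>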

Solution 'the OK' - The second iteration

The primary issue with the initial solution is that it can process only one product request at a time. Furthermore, if the database is unavailable, requests may be lost. To address this, the new solution introduces an additional API that exposes an integration synapse named 'callDataService' and handles multiple product requests in a single call using the Iterate mediator.
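
A rough sketch of that splitting API, assuming the request body is a JSON array under a 'products' field (so it is visible as //jsonObject/products once Synapse converts it to XML); the API name and context are illustrative only.

<api xmlns="http://ws.apache.org/ns/synapse" name="PickupOrderAPI" context="/pickuporder">
    <resource methods="POST">
        <inSequence>
            <!-- Split the incoming array of product requests; each split message
                 is handed to the 'callDataService' integration for one product.
                 Responses would normally be collected again with an Aggregate mediator. -->
            <iterate expression="//jsonObject/products" sequential="true">
                <target sequence="callDataService"/>
            </iterate>
        </inSequence>
    </resource>
</api>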
To enhance resilience, the new API also exposes a resource that stores requests in a Kafka message queue (using the Kafka connector). A Kafka inbound endpoint then monitors a specific topic and directs received messages to the 'callDataService' integration.
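
Roughly, the two Kafka-facing pieces look like the sketch below. The topic, connection details, sequence names, and the inbound class and parameter names are my recollection of the Kafka connector and Kafka inbound endpoint docs, so treat them as assumptions to verify against the actual project.

<!-- Producer side: a sequence that pushes the request onto a Kafka topic -->
<sequence name="publishPickupRequest" xmlns="http://ws.apache.org/ns/synapse">
    <kafkaTransport.init>
        <bootstrapServers>localhost:9092</bootstrapServers>
        <keySerializerClass>org.apache.kafka.common.serialization.StringSerializer</keySerializerClass>
        <valueSerializerClass>org.apache.kafka.common.serialization.StringSerializer</valueSerializerClass>
    </kafkaTransport.init>
    <kafkaTransport.publishMessages>
        <topic>pickup-requests</topic>
    </kafkaTransport.publishMessages>
</sequence>

<!-- Consumer side: a Kafka inbound endpoint feeding the topic into 'callDataService' -->
<inboundEndpoint xmlns="http://ws.apache.org/ns/synapse" name="pickupRequestConsumer"
                 class="org.wso2.carbon.inbound.kafka.KafkaMessageConsumer"
                 sequence="callDataService" onError="pickupErrorSequence" suspend="false">
    <parameters>
        <parameter name="interval">1000</parameter>
        <parameter name="bootstrap.servers">localhost:9092</parameter>
        <parameter name="topic.name">pickup-requests</parameter>
        <parameter name="group.id">pickup-consumer-group</parameter>
        <parameter name="contentType">application/json</parameter>
        <parameter name="key.deserializer">org.apache.kafka.common.serialization.StringDeserializer</parameter>
        <parameter name="value.deserializer">org.apache.kafka.common.serialization.StringDeserializer</parameter>
    </parameters>
</inboundEndpoint>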
The final challenge was that the result is an array containing product IDs, quantities, and store details, while the request was to group this information by storage location and list the products to be picked up from each location. For this we used the Script mediator (big thanks go to Arunan Sugunakumar).
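
A sketch of that grouping step with the Script mediator, assuming each entry in the incoming array carries address, zip, productName, and quantity fields; the real field names depend on the data-service response.

<script language="js"><![CDATA[
    // Group the pickup lines by storage address so each store receives one consolidated list
    var lines = mc.getPayloadJSON();
    var byStore = {};
    for (var i = 0; i < lines.length; i++) {
        var line = lines[i];
        if (!byStore[line.address]) {
            byStore[line.address] = { address: line.address, zip: line.zip, items: [] };
        }
        byStore[line.address].items.push({ product: line.productName, quantity: line.quantity });
    }
    var stores = [];
    for (var key in byStore) {
        stores.push(byStore[key]);
    }
    mc.setPayloadJSON({ stores: stores });
]]></script>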
The final addition was the 'sendEmail' integration synapse, which accepts the payload, iterates through it, and sends each store an email with the products and quantities that need to be prepared for pickup (using the Email connector).
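
Inside that sequence, the actual send could look roughly like this. The connection name, sender address, and the 'storeEmail' and 'pickupSummary' properties are placeholders I assume are set earlier in the flow, not names from the real project.

<sequence name="sendEmail" xmlns="http://ws.apache.org/ns/synapse">
    <!-- One store's consolidated pickup list arrives here (split upstream with Iterate) -->
    <email.send configKey="smtpConnection">
        <from>pickup@example.com</from>
        <to>{$ctx:storeEmail}</to>
        <subject>Products to prepare for pickup</subject>
        <content>{$ctx:pickupSummary}</content>
        <contentType>text/plain</contentType>
    </email.send>
</sequence>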

Solution 'Good Enough' - The third iteration

Ultimately, to accommodate the final requirement of enabling requests through a JSON file dropped in an FTP location, I implemented an SFTP watcher. This watcher monitors files with a .json extension, parses them, and then sends them to a Kafka message queue.
To achieve this, I used the File inbound endpoint connector; a configuration sketch follows the note below.

Note: don't forget to URL-encode your password when creating the connection string (it took me a while to figure out why my connection wasn't working).
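
Here is a sketch of that watcher. The SFTP URI, credentials, intervals, and sequence names are placeholders, and the %40 in the password stands in for '@', as per the note above.

<inboundEndpoint xmlns="http://ws.apache.org/ns/synapse" name="jsonFileWatcher"
                 protocol="file" sequence="publishPickupRequest" onError="pickupErrorSequence" suspend="false">
    <parameters>
        <parameter name="interval">5000</parameter>
        <parameter name="sequential">true</parameter>
        <parameter name="transport.vfs.ContentType">application/json</parameter>
        <!-- Only pick up .json files -->
        <parameter name="transport.vfs.FileNamePattern">.*\.json</parameter>
        <!-- '@' in the password is URL-encoded as %40 -->
        <parameter name="transport.vfs.FileURI">sftp://pickupuser:p%40ssword@sftp.example.com/inbox</parameter>
        <parameter name="transport.vfs.ActionAfterProcess">MOVE</parameter>
        <parameter name="transport.vfs.MoveAfterProcess">sftp://pickupuser:p%40ssword@sftp.example.com/processed</parameter>
    </parameters>
</inboundEndpoint>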

Summary:
All connectors used in the solution, and many more, can be found in the WSO2 store.
The whole MI solution can be found in the GitRepository.
The journey of refining and enhancing your solution knows no bounds, and it all begins with that very first step. And as Bilbo said: "..It's a dangerous business, going out your door. You step onto the road, and if you don't keep your feet, there's no knowing where you might be swept off to..."
