Help With This Software Architecture

Chad Smith
Work with C# all day everyday creating web applications. Huge baseball buff.
・1 min read

I had a question and wanted to get some software architecture ideas to see how others would go about designing this system out.

The basic idea/flow of the system is as follows:

  • User uploads a file to a server, which will store it on some cloud storage, like AWS S3.
  • Once the file has been transferred to S3, it needs to be processed by a specific on-premises server. If that server is currently busy processing another file, the new file needs to wait in a queue.
  • The on-premises server doing the processing needs to read the files from the queue and process them.
  • Once a file is finished processing, the results need to be uploaded back to storage, and we need to notify the user that everything is done.
  • Throughout all of this, some status updates for the user would be nice to see. For example: "Waiting to be processed..." "File is being processed..." "Results Ready."

I guess some general notes: the on-premises machine doing the processing would be a Linux box.
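The status lifecycle described in the steps above could be modeled explicitly, whatever queue or storage you pick. A minimal sketch in Python (the enum and transition table are illustrative assumptions, not from any specific framework; the status strings are taken from the post):

```python
from enum import Enum

# Illustrative status lifecycle for an uploaded file.
class FileStatus(Enum):
    UPLOADED = "Waiting to be processed..."
    PROCESSING = "File is being processed..."
    DONE = "Results Ready."

# Legal transitions: uploaded -> processing -> done.
TRANSITIONS = {
    FileStatus.UPLOADED: FileStatus.PROCESSING,
    FileStatus.PROCESSING: FileStatus.DONE,
}

def advance(status: FileStatus) -> FileStatus:
    """Move a file to its next stage, refusing illegal jumps."""
    if status not in TRANSITIONS:
        raise ValueError(f"{status} is terminal")
    return TRANSITIONS[status]
```

Keeping the transitions in one table makes it easy to push each change to the user (websocket, polling, etc.) from a single place.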

I'm just looking for general ideas on how others would approach this. Any ideas are great. Thanks!

Discussion (3)

Andreas Jakof

Reads straightforward to me...
Have a database with the file ID, path, user, and status, and update it there.

Upload: put the file in storage, then insert a DB row: FileId/Path, user, "waiting for processing".

That also gives you a queue, so the on-premises job can go into the DB, load the next waiting file, and update its status to "processing".

When done, upload the result file and update the database with the (possibly new) file path and "processed".

Not sure why the processing job needs to be on-premises; you could run it in the cloud as well using Docker or just some PaaS.
With Azure (I don't know about other clouds) you could even trigger an Azure Function (and from there a WebJob) on the upload using an EventTrigger.
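The database-as-queue idea above can be sketched with SQLite standing in for the real database (schema and status strings are assumptions for illustration; the key point is the conditional UPDATE so two workers can't claim the same file):

```python
import sqlite3

# Files table doubles as the work queue, as suggested above.
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE files (
    id INTEGER PRIMARY KEY,
    path TEXT,
    user TEXT,
    status TEXT)""")

def enqueue(path, user):
    # Called by the upload handler once the file lands in storage.
    db.execute("INSERT INTO files (path, user, status) VALUES (?, ?, ?)",
               (path, user, "waiting for processing"))
    db.commit()

def claim_next():
    # Called by the on-premises worker. The UPDATE only succeeds if the row
    # is still 'waiting', so a competing worker can't grab the same file.
    while True:
        row = db.execute(
            "SELECT id, path FROM files WHERE status = ? ORDER BY id LIMIT 1",
            ("waiting for processing",)).fetchone()
        if row is None:
            return None  # queue is empty
        cur = db.execute(
            "UPDATE files SET status = 'processing' WHERE id = ? AND status = ?",
            (row[0], "waiting for processing"))
        db.commit()
        if cur.rowcount == 1:
            return row  # claimed it; otherwise retry

def finish(file_id, result_path):
    # Record the (possibly new) result path and mark the row done.
    db.execute("UPDATE files SET path = ?, status = 'processed' WHERE id = ?",
               (result_path, file_id))
    db.commit()
```

The same claim-by-conditional-update pattern works on Postgres or SQL Server; there you could also use row locking (e.g. `FOR UPDATE SKIP LOCKED`) instead of retrying.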

Ganesh Raskar

Here's my two cents:
Tech: Azure Storage Account, Azure Relay, Azure Functions

  1. Upload the file to a storage account with a container ('files-to-process').
  2. A function will listen for the file-upload event on the 'files-to-process' container. This function will notify your on-premises server about the file via Azure Relay (you can apply your business logic/rules here: checking server status, file status, etc.).
  3. The server will download the file from storage, process it, and upload it back ('processed-files').
  4. Another function will listen on the 'processed-files' container and notify the user of the new status.

You can use dynamic containers if you want, but then you will need to maintain the file status (Pending, Processed, etc.) in Table Storage or a database.
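The event wiring in the four steps above can be sketched with a toy in-memory dispatcher (the real thing would be Azure Functions with Blob/Event Grid triggers; container names, handler names, and the decorator are illustrative assumptions, not Azure SDK APIs):

```python
# Toy event dispatcher: container name -> registered callbacks.
listeners = {}

def on_upload(container):
    """Register a handler for uploads to a container (stands in for a blob trigger)."""
    def register(fn):
        listeners.setdefault(container, []).append(fn)
        return fn
    return register

def upload(container, blob_name):
    # Simulates a blob landing in a container and firing its event.
    for fn in listeners.get(container, []):
        fn(blob_name)

notifications = []

@on_upload("files-to-process")
def notify_on_prem(blob_name):
    # In the real setup this would go over Azure Relay to the on-premises server.
    notifications.append(("server", blob_name))

@on_upload("processed-files")
def notify_user(blob_name):
    # In the real setup this would surface "Results Ready." to the user.
    notifications.append(("user", blob_name))
```

The nice property of this shape is that each function only knows about one container, so adding a new stage is just another handler.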

Yaser Al-Najjar

I think a normal server (for processing), S3 (for storage), and SQS (for the queue) are all you need.

Not sure, though, about the caveats in your specific use case.
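The SQS consumer side of this suggestion follows a receive → process → delete loop. A runnable sketch with the queue client injected so it works without AWS (the stub mimics the shape of the calls; with boto3 you would swap in `receive_message`/`delete_message`, and an unacknowledged message would become visible again after the visibility timeout):

```python
import queue

class StubQueue:
    """In-memory stand-in for an SQS queue, for illustration only."""
    def __init__(self):
        self._q = queue.Queue()
    def send(self, body):
        self._q.put(body)
    def receive(self):
        try:
            return self._q.get_nowait()
        except queue.Empty:
            return None
    def delete(self, msg):
        pass  # real SQS needs an explicit delete, or the message reappears

def drain(q, process):
    """Receive messages until the queue is empty, deleting each on success."""
    handled = []
    while (msg := q.receive()) is not None:
        process(msg)
        q.delete(msg)  # only delete after processing, so failures get retried
        handled.append(msg)
    return handled
```

Deleting only after successful processing is the important detail: it is what makes the queue survive a crash of the on-premises worker mid-file.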