DEV Community

GraphQL Tutorial - How to Manage Image & File Uploads & Downloads with AWS AppSync & AWS Amplify

Nader Dabit on June 28, 2019

How to create & query images and files using GraphQL with AWS AppSync, AWS Amplify, and Amazon S3
sakhmedbayev

Nader, when I set my S3 bucket to public per the instructions, I got a warning in the AWS S3 console:

This bucket has public access
You have provided public access to this bucket. We highly recommend that you never grant any kind of public access to your S3 bucket.

Does this mean that anyone can use this S3 bucket?
Please let me know.
Thanks

Nader Dabit

Yes, if you set a bucket, or a folder in a bucket, to public, anyone can read from that bucket. I put a warning letting readers know about this, as it is not recommended by AWS security policy, but many people ask for or want this functionality, so I showed how it could be done as well.

sakhmedbayev

Can public users of the S3 bucket only read from it, or write to it as well? How can I make it more secure?

Nader Dabit

Ah, no, they would only be able to read from it using the instructions here. To make it secure from reads as well, see the other example I provided in this tutorial.
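To make "secure from reads" concrete: Amplify's Storage access levels map onto S3 key prefixes, and only objects under the public prefix can be read without credentials. A small sketch of that mapping (the helper below is illustrative, not part of aws-amplify; `identityId` is the user's Cognito identity):

```javascript
// Illustrative helper (not part of aws-amplify): these are the key
// prefixes Amplify Storage uses per access level. Only public/ objects
// are readable anonymously; protected/ and private/ objects are scoped
// to a Cognito identityId and require a signed request.
function s3Prefix(level, identityId) {
  switch (level) {
    case 'public':
      return 'public/';
    case 'protected':
      return `protected/${identityId}/`;
    case 'private':
      return `private/${identityId}/`;
    default:
      throw new Error(`unknown access level: ${level}`);
  }
}

// Uploading with a non-public level (assumes a configured Amplify app):
// await Storage.put('avatar.png', file, { level: 'private' });

console.log(s3Prefix('private', 'us-east-1:1234')); // private/us-east-1:1234/
```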

miliu99

Hi Nader, thanks for this great tutorial. I got everything working, except that user-uploaded images are not showing up (they are in S3, as far as I can tell); I get a "403 Forbidden" error in the browser console even though I'm signed in. Can you tell what I missed?

Product images are shown without problem.

rudyhadoux

Same problem.

Jupitercow

I have the same issue. Can't seem to figure it out. I wonder what we missed...

Maria Campbell

Just what I needed! Using multer with MongoDB simply did not cut it for me, especially because you can only delete images/files locally; they are not deleted in S3! Looking forward to learning more about DynamoDB and what it can do for me, especially when it comes to working with dynamic images, NoSQL, and GraphQL, along with serverless. Thanks for this great post, Nader!

Nader Dabit

Thanks Maria, glad you found it useful!

Maria Campbell

Absolutely! Since I have you here, I'll ask a question I was going to ask in person tonight; asking here will leave me more time for other questions. As far as dynamically deleting an image or file from the client so that it is also deleted in S3: would the Amplify S3Image component do the trick? Since it renders an Amazon S3 object key as an image, as I understand it, that would make it possible to identify the image in question for deletion. The other examples regarding deleting files only show the deletion of individual, hard-coded file names. But if one were to use the key approach, files could be dynamically deleted, right? Or am I getting this all wrong? Thanks in advance!

Comment deleted

Nader Dabit
  1. So there are two parts to accessing the S3Object: one, the bucket itself, and two, the actual API. Typically the best security practice is to leave all images secure and only access them using a signed URL. The private-images example I gave is the use case I typically recommend. If we use @auth rules for owner, only the user who uploaded the image would be able to view it, but in reality we want it to be available to any user of the app. Sure, we could set queries to null and allow anyone to access the location of the image, but either way we ideally only want reads coming directly from our app to succeed.

  2. We actually have equal support for Angular & Vue. We now have an advocate on our team who specializes in Angular, but he does not write as much content; he's busy traveling around giving workshops and talks. I think we see many more articles about React because I am very visible and active in that community, but in reality there is pretty much feature parity between the frameworks.

  3. I don't know the answer to this. If this is a feature you'd like, I'd suggest submitting an issue in the GitHub repo and we can see about putting it on our roadmap.

Comment deleted

Nader Dabit

1. No, the @auth rules only apply to the GraphQL API, not the S3 storage bucket. The rules you mentioned will allow anyone to read from the database, but a user still needs to be authorized to read from the S3 bucket in some way, signed in or not, via the Amplify SDK (which sends a signed request and gets back a signed URL that is valid for a set period of time)

4. Yes, we support multi-auth now (starting last week) from the CLI -> aws-amplify.github.io/docs/cli-too...

5. You can update the API key by changing the expiration date in the local settings and running amplify push to update -> aws-amplify.github.io/docs/cli-too...

Comment deleted

Nader Dabit
  1. Yes, you can combine authorization rules. See details here

  2. Private access is built into Amplify - see the docs here referencing private access

  3. Yes, the process of storing would be the same; the only difference is you would need to deal with standard streaming/buffering protocols on the client, which are agnostic to Amplify.

Mike Lu

I have seen some comments saying the upload feature is not working.
I also faced this issue, but I resolved it by configuring Amplify with aws-exports.js.
Hopefully this is a useful reference for everyone.
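For anyone hitting the same thing, a minimal sketch of the configuration step Mike describes, assuming a project where `amplify push` has generated `src/aws-exports.js` (paths and names follow the Amplify CLI defaults; adjust to your project layout):

```javascript
// Sketch: wire Amplify to the generated config before calling Storage.
// Without this step, Storage/API calls can fail with missing-credentials
// or 403 errors even though the bucket and objects exist.
import Amplify, { Storage } from 'aws-amplify';
import awsconfig from './aws-exports';

Amplify.configure(awsconfig);

// Uploads are authorized only after configure() has run:
// await Storage.put('test.txt', 'hello');
```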

rudyhadoux

Hi Nader,

For the Private Access part, I get a 403 when fetching images, for both:

Storage.put(this.key, file, {
  level: 'public',
  contentType: 'image/*'
})

Storage.put(this.key, file, {
  level: 'private',
  contentType: 'image/*'
})

How can I fix it?

rudyhadoux

I have found the solution here :
itnext.io/part-2-adding-authentica...

You just store the document key, and for each access use:
await Storage.vault.get(key) as string;

abbatyya

Hi Nader, I have two questions:

  1. How could I accomplish the same using DataStore? I also followed chapter 9 of your book (Full Stack Serverless), but could not find a solution.
  2. How can I store a file inside a folder in a bucket? Say I have a bucket called 'mmsStorage' and inside it I create a folder called 'userProfiles'; how can I store the image inside 'userProfiles'?

Please, I need your help ASAP. Thank you.
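On question 2: S3 has no real folders, so a "folder" is just a prefix in the object key, and `Storage.put` treats `/` in the key as folder structure. A small sketch (the helper name and key layout are hypothetical):

```javascript
// Hypothetical helper: a '/' in the S3 object key is rendered as a
// folder in the console, so storing under a "userProfiles" folder just
// means prefixing the key with 'userProfiles/'.
function userProfileKey(userId, filename) {
  return `userProfiles/${userId}/${filename}`;
}

// Upload into the "folder" (assumes a configured Amplify app):
// await Storage.put(userProfileKey('user-1', 'avatar.png'), file, {
//   contentType: 'image/png',
// });

console.log(userProfileKey('user-1', 'avatar.png')); // userProfiles/user-1/avatar.png
```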

Melf Clausen

Thanks for the tutorial! But if I'm not mistaken, you never actually explain how to get a signed URL to access the image. And when I google it, the process seems pretty complex. Am I missing something?
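For what it's worth, with Amplify you don't construct the signed URL yourself: `Storage.get(key)` returns a time-limited signed URL (the `expires` option sets its validity in seconds). Because those URLs expire, a tiny cache like the sketch below can avoid re-fetching on every render; it is illustrative, not part of Amplify, and in a real app `getUrl` would be something like `key => Storage.get(key, { expires: 60 })`:

```javascript
// Illustrative sketch: memoize signed URLs until shortly before they
// expire. getUrl is an async fetcher (e.g. Amplify's Storage.get);
// ttlMs should be a bit shorter than the URL's real validity window.
function makeSignedUrlCache(getUrl, ttlMs) {
  const cache = new Map();
  return async function resolve(key) {
    const hit = cache.get(key);
    // Reuse the cached URL while it is still fresh.
    if (hit && hit.expiresAt > Date.now()) return hit.url;
    // Otherwise fetch a new signed URL and remember when it goes stale.
    const url = await getUrl(key);
    cache.set(key, { url, expiresAt: Date.now() + ttlMs });
    return url;
  };
}
```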

NewToStuff

Hi Nader,
Really interesting article. It's showing pretty much what I wish to do, but in an Angular 10 environment. I have set up GraphQL using AWS Amplify.API and was hoping for an elegant way to upload documents to S3 and store them with the S3Object. I thought the tools could do this uploading, and I have seen some notes that suggest it may be possible.

Ideally I wish to:

  • upload a document (of any type)
  • keep a record of the document in the GraphQL database
  • manage access permissions through my app depending on whether the Cognito user belongs to a group that can access the document - I was intending to store the groups in the GraphQL database

My initial research seemed to indicate that Amplify.API should be able to do this, but finding examples I can build upon appears to be very difficult.

You mentioned that there is an Angular specialist - are there any links to workshops, videos, or example code that could be shared?

Thanks,

Paul

Dean Merchant

It looks like it's possible for the file upload to S3 to succeed but the GraphQL mutation to fail. How do you deal with the zombie files?

Laurin Quast

You should check out Object Expiration for S3. As part of the mutation, you could then remove the Object expiration, or copy the file to another "persistent" bucket.
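A sketch of Laurin's suggestion, assuming uploads land under a `pending/` prefix and the successful GraphQL mutation copies/renames them into a permanent prefix. The prefix name and rule ID are hypothetical; the rule shape follows the S3 lifecycle configuration API:

```javascript
// Hypothetical lifecycle rule: S3 deletes any object under pending/
// after `days` days. Objects the mutation has moved out of pending/
// are unaffected, so only the "zombie" uploads get cleaned up.
function pendingExpirationRule(days) {
  return {
    ID: 'expire-pending-uploads',
    Status: 'Enabled',
    Filter: { Prefix: 'pending/' },
    Expiration: { Days: days },
  };
}

// Applied once with the AWS SDK (assumes credentials and bucket exist):
// await s3.putBucketLifecycleConfiguration({
//   Bucket: 'my-upload-bucket',
//   LifecycleConfiguration: { Rules: [pendingExpirationRule(1)] },
// }).promise();

console.log(JSON.stringify(pendingExpirationRule(1), null, 2));
```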

Dean Merchant

Thanks Laurin

SHIV SHAKTI

Hi Nader,

I need your help: I need to submit an image in byte format through the GraphQL schema and store it in DynamoDB.

type S3Object {
  bucket: String!
  key: String!
  region: String!
}

input S3ObjectInput {
  bucket: String!
  region: String!
  localUri: String
  mimeType: String
}

Can you please tell me the mutation for this and how it works?
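Not Nader, but here is a hedged sketch of how a mutation using that input might look with the AppSync complex-object pattern, where `localUri`/`mimeType` tell the client SDK to upload the local file for you. The `Photo`/`createPhoto` names and field layout are hypothetical; adapt them to your own schema:

```javascript
// Hypothetical mutation against a schema with a `file: S3ObjectInput`
// field on a Photo type.
const createPhoto = /* GraphQL */ `
  mutation CreatePhoto($input: CreatePhotoInput!) {
    createPhoto(input: $input) {
      id
      file {
        bucket
        key
        region
      }
    }
  }
`;

// Build the input: bucket/region come from your Amplify config, and
// localUri points at the file on the device.
function photoInput(bucket, region, localUri, mimeType) {
  return {
    file: { bucket, region, localUri, mimeType },
  };
}

// Executed with a configured AppSync/Amplify client:
// await API.graphql(graphqlOperation(createPhoto, {
//   input: photoInput('my-bucket', 'us-east-1', file.uri, file.type),
// }));
```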

Anujit Dash

Hi Nader,

I followed the tutorial and was able to upload a video file to the S3 bucket using private access. However, when I tried doing the same thing the next day, I got a credentials error. Any idea what might have happened?

AWSS3Provider - error uploading CredentialsError: Missing credentials in config

Younes Henni

Much needed tutorial. Thanks a lot Nader.

purushottam858

Can I get the code in Python 3?

BernhardSmuts

Thanks for the tutorial.

It always blows my mind how you're supposed to figure this out from barebones in the AWS and Amplify docs...

blairtaylor

Should it not be:

type S3Object @model {
  ...
}

and

avatar: S3Object @connection

?