
How to find Unused Security Groups of all AWS Security Groups?

python-2.7,amazon-web-services,amazon-ec2,amazon-s3,boto

This is a slightly difficult request because Security Groups are used by many different resources, including Amazon EC2 instances, Amazon RDS instances, VPC Elastic Network Interfaces (ENIs), Amazon Redshift clusters, Amazon ElastiCache clusters, Amazon Elastic MapReduce clusters, Amazon WorkSpaces, and most probably other services, too. To obtain a list of...
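A minimal sketch of one common approach, using boto3 (an assumption -- the question is tagged boto 2, but the calls are equivalent): gather every security group ID, subtract the groups attached to any network interface (which covers EC2, RDS, ELB and most other VPC-based services), and treat the remainder as candidates for deletion.

# Sketch, assuming boto3 and a hypothetical region; groups not attached to any
# ENI are only *candidates* -- some usages (e.g. references from other groups'
# rules) will not show up here.
import boto3

ec2 = boto3.client('ec2', region_name='us-east-1')

all_groups = {
    sg['GroupId']
    for page in ec2.get_paginator('describe_security_groups').paginate()
    for sg in page['SecurityGroups']
}

used_groups = {
    group['GroupId']
    for page in ec2.get_paginator('describe_network_interfaces').paginate()
    for eni in page['NetworkInterfaces']
    for group in eni['Groups']
}

print("Possibly unused security groups:", sorted(all_groups - used_groups))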

Node application and file storage

node.js,express,amazon-s3,passport.js,web-frameworks

Express.js is as good as RoR or anything else. No further comments. Yes, absolutely. Your files must sit outside of the application. In almost all scenarios that require persistent storage, you're better off using persistent and reliable storage solutions such as S3. And, shipping the secure files along...

AWS S3 The security of a signed URL as a hyperlink

security,hyperlink,amazon-s3,download

AWS Security Credentials are used when making API calls to AWS. They consist of two components: Access Key (eg AKIAISEMTXNOG4ABPC6Q): This is similar to a username. It is okay for people to see it. Secret Key: This is a long string of random characters that is a shared secret between...
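As a hedged illustration of how signing works in practice, here is a boto3 sketch (bucket and key names are hypothetical): the resulting URL embeds the access key, an expiry timestamp and a signature, but never the secret key itself.

# Sketch, assuming boto3; the URL works for anyone who has it, until it expires.
import boto3

s3 = boto3.client('s3')
url = s3.generate_presigned_url(
    'get_object',
    Params={'Bucket': 'my-bucket', 'Key': 'report.pdf'},  # hypothetical names
    ExpiresIn=3600,  # the link stops working after one hour
)
print(url)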

Throttling S3 commands with aws cli

amazon-web-services,amazon-s3,aws-cli

I ended up using Trickle and capping download & upload speeds at 20,000 kb/s. This let me use my existing script without much modification (all I had to do was add the trickle call to the beginning of the command). Also, it looks like bandwidth throttling has been added as...

How to configure aws CLI to s3 cp with anonymous user

amazon-web-services,amazon-s3,aws-sdk

You probably have to provide an access key and secret key, even if you're doing anonymous access. I don't see an option for anonymous access in the AWS CLI. Another way to do this is to hit the HTTP endpoint and grab the files that way. In your case: http://big-data-benchmark.s3.amazonaws.com You will...
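For completeness, a hedged Python sketch of another option: boto3/botocore can send unsigned (anonymous) requests, so no key pair is needed for a public bucket. The object key below is hypothetical.

# Sketch, assuming boto3; UNSIGNED makes the request anonymous.
import boto3
from botocore import UNSIGNED
from botocore.config import Config

s3 = boto3.client('s3', config=Config(signature_version=UNSIGNED))
s3.download_file('big-data-benchmark', 'some/public/key', 'local_file')  # key is hypothetical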

How to download a json file from S3 via d3.json

javascript,json,d3.js,amazon-s3

You cannot do this because of the browser's 'Same Origin Policy' security model. You can achieve it without a proxy by using JSONP. Have a look at the sample code in the above link for an example...

Lifecycle policy on S3 bucket

amazon-web-services,amazon-s3

What you describe sounds normal. Check the storage class of the objects. The correct way to understand the S3/Glacier integration is that S3 is the "customer" of Glacier -- not you -- and Glacier is a back-end storage provider for S3. Your relationship is still with S3 (if you go...
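To check this, a small boto3 sketch (bucket name is hypothetical) that lists objects and prints their storage class, so you can see which ones the lifecycle rule has already transitioned to Glacier:

# Sketch, assuming boto3 and a hypothetical bucket name.
import boto3

s3 = boto3.client('s3')
for page in s3.get_paginator('list_objects_v2').paginate(Bucket='my-bucket'):
    for obj in page.get('Contents', []):
        print(obj['Key'], obj['StorageClass'])  # e.g. STANDARD or GLACIER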

Uploading image from canvas to S3 in Meteor

canvas,file-upload,meteor,amazon-s3

Slingshot is very much suited for this. You need to extract a Blob from your canvas (use this method) and then pass it to Slingshot: function uploadImage(directive, canvas, type, callback) { var uploader = new Slingshot.Upload(directive); canvas.toBlob(function (blob) { blob.type = type; uploader.send(blob, callback); }, type); return uploader; } uploadImage("myDirective",...

how to conditionally change commas into semicolon with spark-scala map / split

regex,scala,amazon-s3,apache-spark

Add the following map before splitting: val lines = sc.textFile("s3://{bucket}/whatever/2015/05/*.*") //add the map .map(line => line.replaceAll("\",\"", ";")) .map(_.split(",")) .map(p=> zahiro(p(0),p(1),p(2),p(3),p(4),p(5),p(6),p(7))) ...

Which is a better way: retrieve images from AWS S3 or download it and store locally in a temp folder to be displayed?

objective-c,core-data,amazon-web-services,amazon-s3,awss3transfermanager

It depends on how you use them. If your app is going to retrieve the images similar to Instagram or Twitter, it's good to download them as the user requests the images via the app. If, once the images are retrieved, the application is going to use the images again and...

what is the nodejs package for s3 image upload

node.js,image,amazon-s3

I recommend using s3-uploader; it's a flexible and efficient way to resize, rename, and upload images to Amazon S3.

Using s3 in a healthcare application, private links

ruby-on-rails,security,amazon-s3,privacy

From the documentation, you should use one of Amazon's "canned" ACLs. Amazon accepts the following canned ACLs: :private :public_read :public_read_write :authenticated_read :bucket_owner_read :bucket_owner_full_control You can specify the ACL at bucket creation or later update a bucket. # at create time, defaults to :private when not specified bucket = s3.buckets.create('name', :acl...

How to put object to S3 via CloudFront

java,amazon-s3,amazon-cloudfront

Data can be sent through Amazon CloudFront to the back-end "origin". This is typically used when POSTing web forms, to send information back to web servers. It can also be used to POST data to Amazon S3. If you would rather use an SDK to upload data to...

The Mystical Ephemeral File System of Heroku is Not Letting Me Get Files from S3

ruby-on-rails,ruby,amazon-web-services,heroku,amazon-s3

With Heroku you don’t have a single app running, rather you have several dynos each with a copy of your code and running some aspect of your app, and each independent from the others. In particular each dyno’s file system is separate from the others. In your case you push...

Set S3 Buckets defaults before upload

amazon-web-services,amazon-s3,cache-control

There is no concept in S3 of object defaults at the bucket level. To the extent a 3rd party tool emulates this, they are doing exactly what you will need to do -- specifying the desired values with each upload of each new object.
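As a hedged illustration with boto3 (file, bucket and header values are hypothetical), setting the desired headers on each individual upload looks like this:

# Sketch, assuming boto3; Cache-Control must be supplied per object, per upload.
import boto3

s3 = boto3.client('s3')
s3.upload_file(
    'logo.png', 'my-bucket', 'images/logo.png',  # hypothetical names
    ExtraArgs={'CacheControl': 'max-age=86400',
               'ContentType': 'image/png'},
)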

Updating AWS S3 expiration time

amazon-web-services,amazon-s3

I indeed have the right answer. The below Python code likely does the same trick of copying the BUCKET/KEY onto itself and would also reset the expiration. AWS S3 rounds up the expiry time to the same time based on a 24hr clock so any copying done on the same...
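The Python code referenced above is cut off here; the following is a hedged reconstruction using boto3 (bucket and key are placeholders). S3 only accepts a copy of an object onto itself if something changes, hence MetadataDirective='REPLACE' while re-supplying the existing metadata.

# Hedged sketch, assuming boto3: a self-copy refreshes the object's creation
# date, which restarts the lifecycle expiration clock.
import boto3

s3 = boto3.client('s3')
bucket, key = 'my-bucket', 'path/to/object'  # placeholders
head = s3.head_object(Bucket=bucket, Key=key)
s3.copy_object(
    Bucket=bucket, Key=key,
    CopySource={'Bucket': bucket, 'Key': key},
    Metadata=head.get('Metadata', {}),
    MetadataDirective='REPLACE',
)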

Use TfileUnarchive on Amazon S3

amazon-web-services,amazon-s3,talend

The trick here is to break the file retrieval and potential unzipping into one sub job and then the processing of the files into another sub job afterwards. Here's a simple example job: As normal, you connect to S3 and then you might list all the relevant objects in the...

Image Upload Strategy with Clusters And Amazon S3

php,image,amazon-s3

This is all about how you display the images. Let's say an image has been uploaded and you stored a record about it in some shared storage (like a DB), saving the image id and the node-specific URL where the image was temporarily placed. I hope you can access...

How do I use a v2 authentication header with Amazon PHP SDK v3?

php,amazon-web-services,amazon-s3

Unfortunately, you can't. v2 signatures are being phased out and are only supported on a small number of services at this time, and S3 is not one of them. As the PHP SDK is built against AWS offerings, the authentication changes made to the services are reflected in the PHP...

Amazon S3 Download: Direct iOS or Web Service Node Js.?

ios,mysql,node.js,amazon-s3

Good question. Original Idea I would suggest setting up a redirect on your node.js server. So the device hits your server and gets the redirect and downloads from AWS. I believe you could do this for uploads too [EDIT: Nope upload wouldn't work because redirects would cause the POST to...

S3 allow only encrypted policy doesn't work

amazon-s3

It's possible that you cannot specify SSE as part of the PUT, or that the condition is not available on a pre-signed URL since it only looks at the headers. I tested by uploading from a form (using a POST) and the policy kicks in to block the upload because it is...

AWS: How do I support more than 2 policies in IAM?

amazon-web-services,amazon-s3

There is more than one kind of policy in AWS. The managed policies are a layer added on top of the existing inline policies which can be attached to a user or group. The existing IAM policy documentation is here: Permissions and Policies Managed Policies and Inline Policies To read...

Is there a way to use http link instead of https in S3 getSignedUrl?

node.js,amazon-s3

To disable SSL for downloads, set sslEnabled to false in the AWS config. AWS.config.update({ accessKeyId: key, secretAccessKey: secret, sslEnabled: false }); ...

Delete an S3 Bucket that has some data archived in Glacier

amazon-s3,amazon-glacier

You can't delete a bucket that is not empty, so you'll need to delete everything stored in the bucket, including what's stored in Glacier, first. If everything in Glacier was migrated to the glacier storage class over 3 months ago, then you should not incur any charges. If you don't...
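A minimal boto3 sketch of the emptying-then-deleting sequence (bucket name is hypothetical); objects that have transitioned to the GLACIER storage class are removed through the same delete call as any other object:

# Sketch, assuming boto3 and a non-versioned bucket.
import boto3

bucket = boto3.resource('s3').Bucket('my-bucket')  # hypothetical name
bucket.objects.all().delete()            # removes objects in any storage class
# bucket.object_versions.all().delete()  # use this instead if versioning is enabled
bucket.delete()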

Why are the object values getting pushed into the array 3 times?

javascript,jquery,arrays,amazon-s3

You are pushing the same object into the array in each iteration of: data.Contents.forEach(function(content,contentIndex){...}); So, as many times as that .forEach() loop iterates, you end up pushing the exact same bucketObj object into the csvBucketArr array. If you want each iteration of that .forEach() to put a new and different...

Get files from amazon s3 buckets sub folder .Net

.net,amazon-web-services,amazon-s3

Just set your bucket name like so: bucketname/foldername. So if you have a bucket called 'MyBucket' and a folder within that bucket called MyFolder, then you would do: var objRequest = new GetObjectRequest { BucketName = "MyBucket/MyFolder", Key = o.Key }; ...

Understanding server architecture: Delivering content from AWS S3 using Nginx reverse-proxy or Apache server

apache,amazon-web-services,nginx,amazon-s3

Does setting Access Control Level to S3 bucket to ALL_USERS ( to authenticated and anonymous users) compromise on data privacy? Absolutely. Don't do it. If I use reverse proxy, there is no way for the user to determine S3 bucket urls. Is the data safe and private? Theoretically, they...

Why does my image url from Amazon S3 have AWSaccesskey and expiration even though I made the bucket public?

ruby-on-rails,amazon-s3,carrierwave

Turns out I need to change the acl in the configuration file from config.aws_acl = :'public-read' to config.aws_acl = :public_read even though the documentation said to use :'public-read'....

Creating amazon aws s3 pre signed url PHP

php,amazon-s3,pre-signed-url

Well, if anyone else has any trouble with this like I did, here is the answer. I went into the Amazon PHP development forums and got help from the professionals. It seems you may be flip-flopping between Version 2 and Version 3 of the SDK or looking at the wrong...

Upload file amazon s3 pre signed post

ruby-on-rails,amazon-web-services,amazon-s3

OK, I found it! It was really stupid: I had an extra "s3" in s3-eu-west-1.s3.amazonaws.com/my_bucket. The correct way is: s3-eu-west-1.amazonaws.com/my_bucket

Access file on S3 with GoodData File-Download?

amazon-s3,gooddata

Which component are you using? FileDownload? If so, this works for me (I have just tried it): https://${S3AccessKey}:${S3SecretKey}@${BucketName}.s3.amazonaws.com/folder/folder/filename.csv Make sure that the file on S3 really exists at that path (check it by connecting there using some third-party tool like s3cmd or Cyberduck). If still no luck, please add some...

How to upload a photo in Meteor to S3 and have it sync to database item?

mongodb,file-upload,meteor,amazon-s3

You definitely need to add the S3 url to your collection schema, this can be done upon receiving the url in the file upload callback using Mongo.Collection.update. [...] var appetizerId = Appetizers.insert({ name: addNewAppetizerVar, description: addNewAppDescVar }); var files = $("input.file_bag")[0].files; S3.upload({ files: files, path: "subfolder" }, function(error, s3Url){ Appetizers.update(appetizerId,...

Unauthenticated bucket listing possible with boto?

amazon-s3,boto

The docs don't mention it, but after digging into the code I discovered a hidden kwarg that solves my problem: conn = boto.connect_s3(anon=True) Then you can call conn.get_bucket() on any bucket that is publicly readable....

Reading many small files from S3 very slow

amazon-web-services,amazon-s3,hive,apache-pig,elastic-map-reduce

You can either: use distcp to merge the files before your job starts: http://snowplowanalytics.com/blog/2013/05/30/dealing-with-hadoops-small-files-problem/ or have a Pig script that will do it for you, once. If you want to do it through Pig, you need to know how many mappers are spawned. You can play with the following...

Weighted round robin dns between 2 Cloudfront distributions

amazon-web-services,amazon-s3,amazon-cloudfront,amazon-route53

Unfortunately what you are trying to do is not possible. CloudFront, or any HTTP server for that matter, only sees the host header of test.example.com. It has no idea how you got there, be it WRR DNS or a hosts file; it only sees the host header. I'm not sure how...

Django and S3 - static URL won't change

python,django,amazon-s3,boto,django-storage

I found the solution. All I had to do was to set this in my settings: import boto.s3.connection AWS_S3_CALLING_FORMAT = boto.s3.connection.VHostCallingFormat() ...

How to upload S3 files in KeystoneJS

node.js,express,amazon-s3,keystone.js

I found out how: you can use the updateHandler that comes with Keystone. They're still using req.files from Express 3.x though. // A express file generator function writeToFile(fileName, txt, ext, callback) { var rndm = crypto.randomBytes(20).toString('hex'), file_path = '/tmp/css_temp_' + rndm + '.' + ext, the_file = {}; fs.writeFile(file_path, txt,...

Update AWSS3 with Cocoapods

ios,amazon-web-services,amazon-s3

You are using version 1 of the AWS Mobile SDK for iOS. We officially started supporting CocoaPods with version 2 of the SDK, and the AWSS3 pod pulls down only version 2 of the SDK. You cannot use CocoaPods to install the AWS Mobile SDK until...

Upload to S3 return Forbidden

javascript,node.js,amazon-s3,sails.js

If console.log(err) is indeed null as you mentioned in the comments, it's quite unlikely that it's a problem with s3, as there's no reason for sails to return a 403 if this passes. Instead I'd suggest to check if fileUpload is ever called. If it is not, make sure the...

How to transfer files from iPhone to EC2 instance or EBS?

ios,iphone,amazon-ec2,amazon-s3,amazon-ebs

Allowing an app to write directly to an instance file system is a non-starter, short of treating it as a network drive, which would be pretty convoluted, not to mention the security issues you'll almost certainly have. This really is what S3 is there for. You say you are...

Upload zip files to service for user download

node.js,file-upload,express,amazon-s3,zip

You can use the child_process module to run a zip command in the background (this example assumes you're on Linux; you can modify it to suit Windows). This just handles the zip process, then you can respond with the link to download the file: var exec =...

mkey of a File on AWS

android,amazon-web-services,amazon-s3

mkey is the identifier of the object stored under a bucket. It's equivalent to a file name on your local storage. Since you want to upload a file, you have to give it a name (the mkey in S3), which can be basically anything you want. You will address the...

Why is this python boto S3 multipart upload code not working?

python,amazon-web-services,amazon-s3,multiprocessing,boto

If it fits your use case, you may want to use the AWS Command-Line Interface (CLI), which can automatically use multi-part upload for you. aws s3 cp file.txt s3://bucket/file.txt...
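As a Python alternative to the CLI (a swap, not the boto 2 code from the question), boto3's managed transfer layer also performs multipart uploads automatically once a file crosses the threshold; a hedged sketch with placeholder names:

# Sketch, assuming boto3; parts are uploaded in parallel above the threshold.
import boto3
from boto3.s3.transfer import TransferConfig

config = TransferConfig(multipart_threshold=8 * 1024 * 1024,  # 8 MB
                        max_concurrency=4)
s3 = boto3.client('s3')
s3.upload_file('file.txt', 'my-bucket', 'file.txt', Config=config)  # placeholder names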

Accessing a s3 bucket from Node

node.js,amazon-s3,elastic-beanstalk,passport.js

Making my comment into an answer since this led you to the solution. You should rewrite your routing like this: app.get('/secure/*', function(req, res){...} Hence every request starting with /secure/ will be attached to this route. Then you can extract the path by accessing req.originalUrl....

How to turn an s3 object string into something useful when using laravel 5.1 filesystem

php,amazon-s3,laravel-5,file-conversion,flysystem

Okay, so I made things work, but I am in no way sure if this is the smartest way. But hey, it's a step along the way. Note this solution lets anyone, authenticated or not, access your S3 object URLs. I haven't figured out how to control access yet. Useful...

AWS::S3::Errors::NoSuchKey: No Such Key error

ruby-on-rails,ruby,amazon-web-services,amazon-s3

Assuming aws-sdk v1 for Ruby: small = S3_BUCKET.objects[small_path] does not actually get any objects. From: https://docs.aws.amazon.com/AWSRubySDK/latest/AWS/S3/Bucket.html bucket.objects['key'] #=> makes no request, returns an S3Object bucket.objects.each do |obj| puts obj.key end So you would need to alter your code to something like: to_delete = [] S3_BUCKET.objects[small_path].each do |obj| to_delete << obj.key...

ArgumentError - unknown SSL method `TLSv1_2'

ssl,amazon-s3,carrierwave,fog

Instead of setting it inside the fog_credentials hash, try setting it afterwards on config itself with the following 2 lines: config.fog_authenticated_url_expiration = 600 config.fog_attributes = { ssl_version: :TLSv1_2 } ...

AmazonS3Exception: x-amz-website-redirect-location header is not supported for this operation

android,amazon-web-services,amazon-s3

The x-amz-website-redirect-location metadata is for static website only. Static website has a different endpoint <bucket-name>.s3-website-<AWS-region>.amazonaws.com. See http://docs.aws.amazon.com/AmazonS3/latest/dev/WebsiteHosting.html for more information. Do you mind explaining the purpose of setting it? I'll see if there is an alternative.

Creating xsd document from file download

python,amazon-s3,xsd,lxml

Use .content, which is of type bytes: >>> import requests >>> from lxml import etree >>> xsd_url = 'https://s3-us-west-1.amazonaws.com/premiere-avails/movie.xsd.xml' >>> node = etree.fromstring(requests.get(xsd_url).content) The problem is that your xml file specifies an encoding, and therefore it is the xml parser's job to decode this encoding. But your code uses .text, which asks requests to...

jQuery file upload to S3 (and rails) with CORS headers

jquery,ruby-on-rails,amazon-s3,jquery-file-upload

To allow any subpath on https://example.com you would do: <CORSRule> <AllowedOrigin>https://example.com</AllowedOrigin> </CORSRule> The origin header field can either be a wildcard (*) or a url containing one or no wildcard. Ex: https://*.example.com However adding the wildcard on the end https://0.0.0.0:3000/* will not allow any path as one could guess....

Amazon CloudFront: How to get monthly cost breakdown per distribution?

amazon-web-services,amazon-s3,amazon-cloudfront

The last field of the Detailed billing report is ResourceId, which appears to contain the distribution ID or alias for the distribution. I've seen both "ID" and "alias" listed for the same distribution, so I don't know why.

Amazon S3 browser direct upload unique file name

angularjs,node.js,amazon-s3,loopbackjs

The resulting LoopBack.io remote method now looks like below, setting 'Key' in the parameters did the trick. Project.signS3 = function(filename, cb){ var aws = require('aws-sdk'); var AWS_ACCESS_KEY = process.env.AWS_ACCESS_KEY; var AWS_SECRET_KEY = process.env.AWS_SECRET_KEY; var S3_BUCKET = '...'; aws.config.update({ accessKeyId: AWS_ACCESS_KEY, secretAccessKey: AWS_SECRET_KEY, region: 'eu-central-1', signatureVersion: 'v4' }); // Figure out...

S3 bucket policy, how to ALLOW a IAM group from another account?

amazon-web-services,amazon-s3,policy,bucket

IAM groups are not valid principals in S3 bucket policies. See this AWS forum post and this SO post for more discussion.

IOS with Rails Backend Amazon S3 direct upload

ios,ruby-on-rails,amazon-web-services,amazon-s3

If you are using your server to generate temporary credentials for the AWS Mobile SDK, we recommend the following approach: Generate the access key, secret key, and session token on your server. You have many language options including Java, .NET, PHP, Ruby, Python, and Node.js. Implement your credentials provider by...

Django ImageField url slow when using Amazon s3

python,django,heroku,amazon-s3,boto

You can create a new field in your models, for example, image_url. class YourModel(...): image_url = models.CharField(...) # other fields When the image is uploaded/saved the first time, retrieve its URL and populate image_url field with this value. You'll need to save your model again, though. You can use this...
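A minimal Django sketch of that suggestion (model and field names are made up): cache the storage URL in a plain field the first time the image is saved, then render image_url in templates instead of hitting the storage backend on every request.

# Sketch, assuming a hypothetical Photo model.
from django.db import models

class Photo(models.Model):
    image = models.ImageField(upload_to='photos/')
    image_url = models.URLField(blank=True, editable=False)

    def save(self, *args, **kwargs):
        super(Photo, self).save(*args, **kwargs)  # the upload happens here
        if self.image and not self.image_url:
            self.image_url = self.image.url       # cache the generated URL
            super(Photo, self).save(update_fields=['image_url'])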

Unable to access files from public s3 bucket with boto

python,amazon-web-services,amazon-s3,boto

The policy you have allows anonymous users access to the bucket. However, in your case Account B is not an anonymous user, it is an authenticated AWS user and if you want that user to have access you would need to grant it explicitly in the policy. Or, you can...

React-native upload image to amazons s3

javascript,file-upload,amazon-s3,fetch,react-native

multipart/form-data support for React Native (via the XHR FormData API) for mixed payloads (JS strings + image payloads) is in the works. It should land in GitHub soon.

Get Request Fulfilled Response from Amazon S3

java,amazon-web-services,amazon-s3

It turns out that there is no way to get the HTTP status code for a successful response. Just listening for exceptions is enough to be sure the upload succeeded.

How to install SSL on CloudFront correctly?

ssl,amazon-s3,ssl-certificate,amazon-cloudfront

The certificate chain file is a "chain" of trust. It is a combination of the contents of all (usually) of the provided *Trust*.crt files, and they need to be combined in a specific order, including the begin/end lines found in each file. All of the .crt files have a...

AWS S3 utilizing for website static images

php,amazon-web-services,amazon-ec2,amazon-s3

If you're using Apache then you could proxy all the /images URL's using Apache's built-in proxy support. You would want to add something like this to your Apache configuration: ProxyRequests Off <Proxy *> Order deny,allow Allow from all </Proxy> ProxyPass /images/ http://images.s3-website-us-east-1.amazonaws.com/ This actually creates a reverse proxy. When the...

How to route traffic by proximity from Route 53 to closest NGINX server?

amazon-web-services,amazon-ec2,amazon-s3,cloudflare

You want geolocation routing for the API endpoints: http://docs.aws.amazon.com/Route53/latest/DeveloperGuide/routing-policy.html#routing-policy-geo...

Failed to Create preset with Amazon Elastic Transcoder

amazon-web-services,amazon-ec2,amazon-s3,amazon,boto

You can write this using the argument-list unpacking feature in Python: preset_h264_480p_100kbs_mp4_command={ "name":"preset","description": "preset", "container":"mp4","video": {"Codec":"H.264", "CodecOptions":{"Profile":"baseline", "Level":"3", "MaxReferenceFrames":"3"}, "KeyframesMaxDist":"200", "FixedGOP":"false", "BitRate":"600", "FrameRate":"10", "Resolution":"640x480", "AspectRatio":"4:3" },"audio": {"Codec":"AAC","CodecOptions":{"Profile":"AAC-LC"}, "SampleRate":"22050", "BitRate":"32", "Channels":"1" }, "thumbnails":...

Add IP Address to Amazon Web Services (AWS) PEM Key

amazon-web-services,amazon-s3,ip,ip-address,pem

If you are talking about logging on to AWS EC2 instance using SSH credentials using the same key pair that you used while on Office Network, I assume that you have created restricted inbound access to your instance. If this is the issue, to correct this, go to 'Security Group'...

S3Client PHP SDK: Object of class could not be converted to string

php,amazon-web-services,amazon-s3

The argument to S3Client::factory() is supposed to be an array. You're giving it a filename that contains PHP code to return the array, but S3Client doesn't run the file. Try changing the file to: <?php $s3options = array( 'includes' => array('_aws'), 'services' => array( 'default_settings' => array( 'params' => array(...

Should I instantiate an object every request or once upon app launch?

ruby,amazon-web-services,amazon-s3,rack,aws-sdk

Cognito is most useful when you delegate to your end users obtaining credentials and making calls to AWS themselves, so it's not usual to need Cognito in the server side. Edit: If you want to implement developer authenticated identities, then it definitely makes sense to use a Cognito service client...

Serve private mapping from S3 tiles by proxying data or signing urls through heroku?

heroku,amazon-s3,mapping,leaflet,cesium

If the content in S3 is private, you are going to have to authorize the download one way or another, unless the bucket policy allows the proxy to access the content without authentication based on its IP address. Even then, the proxy still needs to verify that the user is...

AWS S3 object listing

javascript,node.js,amazon-web-services,amazon-s3

Folders are illusory, but S3 does provide a mechanism to emulate their existence. If you set Delimiter to / then each tier of responses will also return a CommonPrefixes array of the next tier of "folders," which you'll append to the prefix from this request, to retrieve the next tier....
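A hedged boto3 sketch of one tier of that listing (bucket and prefix are hypothetical): objects at the current level come back in Contents, and the next tier of pseudo-folders in CommonPrefixes.

# Sketch, assuming boto3; feed each returned prefix back in as the next Prefix.
import boto3

s3 = boto3.client('s3')
resp = s3.list_objects_v2(Bucket='my-bucket', Prefix='photos/', Delimiter='/')
for obj in resp.get('Contents', []):
    print('file:  ', obj['Key'])
for prefix in resp.get('CommonPrefixes', []):
    print('folder:', prefix['Prefix'])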

Update S3 file remove meta data

amazon-s3

It can't be fixed, because the behavior you describe isn't actually broken -- it's by design. The following may sound like a word game, but the finer points of the terminology explain the issue. Objects can't be "updated" in S3 -- they can only be overwritten. When you upload a...

Storing user submitted images

node.js,file-upload,amazon-s3,amazon-cloudfront

I think you can use knox for your task. First you need to create a "class" wrapping the S3 API. Do this with knox: var s3 = function(args){ this.client = knox.createClient({ key: '<api-key-here>', secret: '<secret-here>', bucket: 'learnboost' }); } s3.prototype = { getImage: function(userId, imageId){ var d = Q.defer(); var image...

Call to S3Client::setRegion() fails

php,amazon-web-services,amazon-s3,aws-php-sdk

setRegion isn't a supported method in the version of the AWS SDK you're using. (The docs you linked to are for v2 of the SDK.) You can create a new client and pass in the region in the constructor, e.g., new Aws\S3\S3Client(['version' => $s3->getApi()->getApiVersion(), 'region' => $loc])....

Infinite loop when streaming a .gz file from S3 using boto

python,amazon-s3,gzip,boto

Ah, boto. The problem is that the read method redownloads the key if you call it after the key has been completely read once (compare the read and next methods to see the difference). This isn't the cleanest way to do it, but it solves the problem: import boto import...
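A hedged alternative sketch using boto3 rather than the boto 2 code from the question: wrap the streaming body in gzip.GzipFile and read it exactly once, front to back, which avoids triggering a re-download. Bucket and key are placeholders.

# Sketch, assuming boto3; decompresses as it streams, no second download.
import gzip
import boto3

body = boto3.client('s3').get_object(Bucket='my-bucket', Key='logs/file.gz')['Body']
with gzip.GzipFile(fileobj=body) as gz:
    for line in gz:
        print(line)  # replace with your own processing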

Naked Root Domain Hosting

redirect,amazon-web-services,amazon-s3,amazon-route53

Set up the bucket, named for your domain name. The bucket name has to match exactly the address that shows in the browser's address bar. Then, create an A record at the root (apex) of the domain in Route 53, leaving the hostname empty, selecting "Yes" for Alias, then select...

An illustration of AWS hosted zones and buckets

redirect,amazon-web-services,amazon-s3,amazon-route53

I would recommend: In Amazon S3, create a bucket named mysite.com In Amazon Route 53, create a Hosted Zone for mysite.com (and of course, purchase the domain name, or point the current domain name to Route 53) In Route 53, create an A Record for the apex of mysite.com using...

Is it impossible to use AWS CloudFront for downloading my private image on S3?

ios,amazon-web-services,amazon-s3,amazon-cloudfront,aws-sdk

If you want to serve private content via CloudFront, you will find this doc useful http://docs.aws.amazon.com/AmazonCloudFront/latest/DeveloperGuide/PrivateContent.html. Currently the AWS SDK for iOS doesn't support CloudFront directly. You need to manually sign the request. A related question is asked on AWS forum https://forums.aws.amazon.com/thread.jspa?messageID=336955.

Secure file upload directly to s3 or server to s3 (from iOS app) [closed]

ios,node.js,amazon-web-services,express,amazon-s3

You want to choose option 2, have your user upload directly to S3. If you use option 1, you have the possibility of your server going away before it can complete the upload to S3 (think autoscaling, where an instance is taken out of service before it can complete). And...

MalformedXML when tagging an S3 bucket

java,amazon-s3

There's a small mismatch between the Java API and the REST API. Have a look at the PUT Bucket Tag request. It's not explicitly stated, but it is implied that you can only have one TagSet per Tagging collection. BucketTaggingConfiguration, however, allows a list of TagSets. Instead of adding a...

How to limit access in Amazon S3 files to specific people?

ruby-on-rails-4,amazon-s3

It is not recommended to use AWS Identity and Access Management (IAM) for storing application users. Application users should be maintained in a separate database (or LDAP, Active directory, etc). Therefore, creating "one bucket per group" is not feasible, since it is not possible to assign your applications users to...

EC2 can't access S3 file

django,amazon-ec2,amazon-s3,access-denied

What permissions are set? If you're accessing the Url for the image whilst logged in via the admin panel then you have full permissions to view anything/delete anything etc. If the image url is used in a web site you need to make sure that the S3 object has an...

Amazon Redshift: Copying Data Between Databases

postgresql,amazon-web-services,amazon-s3,amazon-redshift,amazon-data-pipeline

There is no way to access tables from two different databases in the same query. You should unload data from one database to S3 using the UNLOAD command and then load it into the new database's table using the COPY command....
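A hedged sketch of that sequence driven from Python with psycopg2 (cluster names, credentials and table names are all placeholders): run UNLOAD while connected to the source database, then COPY while connected to the target database.

# Sketch with placeholder connection details and a placeholder CREDENTIALS string.
import psycopg2

creds = 'aws_access_key_id=<key>;aws_secret_access_key=<secret>'

with psycopg2.connect(host='my-cluster', dbname='source_db', user='u', password='p', port=5439) as src:
    src.cursor().execute(
        "UNLOAD ('select * from my_table') TO 's3://my-bucket/my_table_' "
        "CREDENTIALS '" + creds + "' DELIMITER '|' GZIP;")

with psycopg2.connect(host='my-cluster', dbname='target_db', user='u', password='p', port=5439) as dst:
    dst.cursor().execute(
        "COPY my_table FROM 's3://my-bucket/my_table_' "
        "CREDENTIALS '" + creds + "' DELIMITER '|' GZIP;")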

Laravel 5.1 AWS S3 Flysytem: AWS HTTP error: cURL error 6: Couldn't resolve host name

curl,file-upload,amazon-s3,laravel-5,host

Typical. Realized what was wrong after reading this article by Paul Robinson. I had set my s3 region to be Frankfurt. While my region sure enough is Frankfurt, I needed to refer to it as eu-central-1 as my s3 region in config/filesystems.php. After that I could go on to fix...

Keep getting function error not sure why

android,node.js,amazon-s3,knox-amazon-s3-client

You're passing a String as the first argument to s3.putStream where a Stream is expected. See https://github.com/Automattic/knox/blob/master/lib/client.js#L398 You should be using it like so: s3.putStream(inputStream, videoID, s3Headers, function(err, s3response) { You may be able to populate inputStream by reading the content of req.mediaFile as a stream, if possible. It's not...

AWS S3 Upload - Using AccessKey, SecretKey and SessionToken - iOS SDK

ios,amazon-web-services,amazon-s3

If you want to use temporary credentials generated on your server, you need to implement your own credentials provider. I recommend the following approach: Generate the access key, secret key, and session token on your server. You have many language options including Java, .NET, PHP, Ruby, Python, and Node.js. Implement...

Predetermining number of partitions of RDD

amazon-s3,apache-spark,hdfs

The number of partitions depends on the file type. In your case, since it is an HDFS file, the default number of partitions is the number of input splits, and that will depend on your Hadoop setup. But if all you want is a way of understanding how this works....
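A small PySpark sketch of how to inspect and influence that count (the S3 path is hypothetical): the partition count follows the input splits, but you can ask for a minimum yourself.

# Sketch, assuming PySpark and a hypothetical input path.
from pyspark import SparkContext

sc = SparkContext(appName='partition-check')
rdd = sc.textFile('s3://my-bucket/data/*.csv', minPartitions=16)
print(rdd.getNumPartitions())  # actual count; usually >= minPartitions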

“remoteContext object has no attribute”

amazon-s3,apache-spark,pyspark

You've done a great job getting your data mounted into dbfs which is great, and it looks like you just have a small typo. I suspect you want to use sc.textFile rather than sc.textFiles. Best of luck with your adventures with Spark.

Writing an IAM policy and CORS configuration for Amazon S3

ruby-on-rails,amazon-web-services,amazon-s3,cloud,amazon-iam

I've grown white hairs trying to figure out the proper configuration. Here's one that's working for me: { "Statement": [ { "Sid": "AllowPublicRead", "Action": [ "s3:ListBucket", "s3:GetObject", "s3:PutObject", "s3:PutObjectAcl", "s3:DeleteObject" ], "Effect": "Allow", "Resource": [ "arn:aws:s3:::mybucket/*", "arn:aws:s3:::mybucket" ] } ] } But you also need to set your CORS configuration....

Conditional Resizing using cfs:graphicsmagick

javascript,node.js,amazon-web-services,meteor,amazon-s3

Append '>' for maximum width/height, '^' for minimum width/height: var profileStore = new FS.Store.S3("profileImages", { accessKeyId: "--KEY--", secretAccessKey: "--KEY--", bucket: "meteor-intrepid", folder: "profiles", transformWrite: function(fileObj, readStream, writeStream) { gm(readStream, fileObj.name()).resize('300>').stream().pipe(writeStream); } }); Excerpt from the GraphicsMagick docs: By default, the width and height are maximum values. That is, the image is...

List objects in Google Cloud Storage using the S3 interop API

amazon-s3,google-cloud-storage

To list the objects in a bucket with the XML API you use: GET bucket as documented at https://cloud.google.com/storage/docs/reference-methods#getbucket This is the same interface as the S3 bucket listing API: http://docs.aws.amazon.com/AmazonS3/latest/API/RESTBucketGET.html...

Copy error Amazon Redshift loading from S3

amazon-web-services,amazon-s3,amazon-redshift

change s3://s3-ap-southeast-1.amazonaws.com/dwh-dimensions/dim-products/dim_products.csv to s3://dwh-dimensions/dim-products/dim_products.csv...

Distributing installer of Windows desktop application

windows,deployment,amazon-s3,installer,hosting

Big companies like Microsoft use a content delivery network which makes sure no matter where you come from a download server gets assigned to you which is as near as possible to your current location.

Move files from EC2 to S3 and then delete from EC2

amazon-web-services,amazon-ec2,amazon-s3,aws-php-sdk

I don't see which OS your current EC2 instance is running. But if it is Linux you could use s3fs https://github.com/s3fs-fuse/s3fs-fuse/wiki/Fuse-Over-Amazon which will allow you to mount your bucket like a local drive/folder. Then you can simply move the files there. It will upload them to the bucket in the...

How to create Datasource through AWS Machine Learning SDK

java,amazon-web-services,amazon-s3,aws-sdk

filename- dataset.schema (it is mandatory to have .schema as extension of schema file) { "version": "1.0", "targetAttributeName": "A5", "dataFormat": "CSV", "dataFileContainsHeader": false, "attributes": [ { "attributeName": "A1", "attributeType": "TEXT" }, { "attributeName": "A2", "attributeType": "NUMERIC" }, { "attributeName": "A3", "attributeType": "CATEGORICAL" }, { "attributeName": "A4", "attributeType": "TEXT" }, { "attributeName":...

Can't access s3 buckets after creating instance using IAM profile

ruby,amazon-web-services,amazon-s3,boto,iam

As well as an access key and a secret access key, temporary credentials such as the ones provided by instance metadata also have a session token - without the token the credentials are invalid. Current versions of fog / fog-aws support fetching instance credentials for you, eg storage = Fog::Storage::AWS.new(region:...

s3cmd not working as cron-task when echos/dates are added

amazon-web-services,amazon-s3,cron,cron-task,s3cmd

s3cmd uses a configuration file located at ~/.s3cfg. It's probably having trouble picking that up. Pass in --config=/home/username/.s3cfg and see if that helps. In any case, s3cmd isn't consistently maintained. The official commandline client (aws-cli) is much better in many ways. edit: use this as your .sh file, make sure...