Transfer Data to Amazon S3 Quickly using AWS Import Export
If you have hundreds of gigabytes or even terabytes of data on your local network at home, you probably have it all stored on a computer, an external hard drive or a NAS (network attached storage) device. Having backups of your data is extremely important, but having them all in one place is never a good idea.
I realized this myself when I saw I have over 2 TB of photos, videos, backups, etc., stored on my local NAS. Sure, it has 4 hard drives, and if one fails, none of my data will be lost. However, if my house burns down or gets flooded, everything will be lost along with the NAS. So I decided to back up the data to the cloud.
I checked out Dropbox, SkyDrive, Google Drive, CrashPlan and Amazon S3 and Glacier before finally settling on Amazon S3. Why Amazon? Well, they have a cool service where you can send in an external hard drive up to 16 TB in size and have it uploaded directly to their servers, thereby bypassing the massive problem of trying to upload that data over your slow Internet connection.
With AT&T in my neighborhood, I get a whopping 1.4 Mbytes/sec upload speed. It would take months to upload the 2.5 TB of data I have stored on the NAS. With Amazon Import/Export, you can pay an $80 service fee and have them upload all that data for you in one day. I ended up making a video tutorial that walks you through the whole process from signing up for Amazon Web Services to packing your hard drive and shipping it to Amazon.
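To put those numbers in perspective, here is a quick back-of-the-envelope calculation (a sketch using the 2.5 TB and 1.4 MB/s figures above):

```python
# Back-of-the-envelope: pushing 2.5 TB through a 1.4 MB/s uplink.
data_bytes = 2.5 * 10**12    # 2.5 TB (decimal terabytes)
upload_rate = 1.4 * 10**6    # 1.4 Mbytes/sec
seconds = data_bytes / upload_rate
days = seconds / 86400
print(f"{days:.0f} days of continuous uploading")
```

Even this optimistic figure (roughly three weeks) assumes the link runs flat out around the clock with no interruptions; real connections don't sustain that, which is how an upload like this stretches into months.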
Here is the full transcript of the video:
Hey everyone. This is Aseem Kishore from Online Tech Tips. I’m going to be doing something new today. I’m going to do a video tutorial on the Amazon Web Services Import Export feature. So what is the Import Export feature?
Well it’s basically a way to get a large amount of data into an Amazon S3 bucket or into a Glacier vault. Amazon S3 and Glacier are basically two storage options that you have for data backup and data archiving with Amazon. So why would you want to use this service from Amazon?
Well, it basically lets you move a large amount of data into the Cloud very quickly. If you are someone like me, you might have hundreds of gigabytes of photos and videos stored locally on your computer or on an external hard drive. Trying to upload 100 gigabytes or 500 gigabytes or even a terabyte of data into the Cloud will take you weeks if not months on a slow upload connection. Instead what you can do is copy that data onto an external hard drive that can be up to 16 terabytes in size and just ship that to Amazon, where they will take it to their data center and upload it straight to your bucket or vault, and then you can go ahead and access that from the web.
So to get started, the first thing you are going to have to do is create an Amazon Web Services Account. To do that, you’re going to go to aws.amazon.com and you’re going to go ahead and click on the Sign Up button. Go ahead and type in your e-mail address and then select, “I am a new user,” if you do not have an Amazon account already. If you do, go ahead and select, “I am a returning user,” and you can use your current Amazon account to sign up for Amazon Web Services.
Once you’ve created your Amazon Web Services Account, you’re going to have to download the Import Export tool. This tool is very simple to use. It does take a little configuration, which I am going to go ahead and explain. But you can see on the screen, there is a download link which I am going to add in the caption at the bottom of this video. So go ahead and download that and then extract it into a directory on your computer.
Now that you’ve downloaded that tool and extracted it, you should have a directory that looks like this. At this point, we will need to edit a file called, “AWS Credentials.” This contains two values, Access Key ID and Secret Key. Basically, these are two values that Amazon uses to link to your account. You can get these two values from your Amazon Web Services Account by going to the following address. It’s aws.amazon.com/securitycredentials. On the Security Credentials page, you’re going to go ahead and click on Access Keys.
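For reference, the AWSCredentials file is just two key/value lines. The layout below is illustrative (the exact property names come from the file shipped with the tool, and the key values here are the fake placeholders Amazon uses in its own documentation, not real credentials):

```
accessKeyId:AKIAIOSFODNN7EXAMPLE
secretKey:wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
```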
Now it gets a little confusing here. If you’ve already used Amazon Web Services and have already created keys in the past, then you won’t be able to see your secret key here. This is kind of a new interface from Amazon and in order to see your existing secret keys, you have to click on a Security Credentials link that takes you to the old Legacy page.
If you just created a new account, then you’ll be able to create a new root key. This button will be active. At that point you’ll get an Access Key ID, and you’ll get the secret key, so it will give you both the values. And this is the Legacy Security page where you can access your secret keys if you have already created an Access Key ID for Amazon Web Services. So as you can see here, I have two access keys, and if I wanted to go ahead and see my secret key, I can go ahead and click the Show button and then I can copy those two values into the AWS Credentials file that I had shown you earlier. So you want to go ahead and paste the Access Key ID here and paste the Secret key here.
Now at this point, if you are getting confused by the Access Key ID and the Secret Access key, that’s okay. You really don’t need to know what they are or care about them in any way whatsoever. All you have to do is sign in, get the values, and copy and paste them into that file.
The next thing we’re going to go ahead and do is create an import job. Now the next two parts are the two hardest parts of this whole procedure. In order to create an import job for Amazon S3, we’re going to go ahead and create a manifest file. This manifest file basically contains some information on your device: where you want to store the data, and where you want the device shipped back to.
Now the nice thing is that we don’t have to create this manifest file ourselves. It’s already created for us; we just have to go ahead and fill it out. So what you’ll want to go ahead and do is go into the directory where you have your import export tool and click on Examples. Here you are going to go ahead and open up the S3 import manifest. As you can see here, I’ve already gone ahead and filled out the information for my import job. So let’s go ahead and take a look at this a little bit more closely.
As you can see, the first thing you have to do is type in your access key ID again. You have to get rid of the brackets, and you just go ahead and paste it directly after the colon. The next thing you’re going to want to do is type in the bucket name. You’re going to have to go ahead and create a bucket, which I’m going to show after this, but for now go ahead and type in whatever name you want for where your data is going to be stored. So if you create a folder called Back Up, then anything that you have on your device, any folders or anything in there, will go underneath that bucket name.
The next thing that you will want to go ahead and do is type in your device ID. This is basically a unique identifier for your external hard drive. This can be the serial number that’s on the back of the hard drive. If you don’t have a serial number that’s on the back of your hard drive, what you can go ahead and do is just create a number of your own or create an identifier. Just write that on something, a sticker that you can put onto your device and then just type that value here. It just has to be something that is the same on the device and in this file. Erase device, it’s already set to No, so you are going to leave that. You can leave the next one. Service level is standard, you can leave that. And the return address, you’re going to go ahead and fill out your address like I’ve done here. In the original file, there are some optional fields. You have to go ahead and remove those if you’re not going to use them. So you can just go ahead and delete those lines out.
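Pulled together, a filled-out manifest ends up looking something like the fragment below. The field names follow the sample file described above (they may differ slightly by tool version), and the bucket, device ID, and address are made-up placeholders:

```
manifestVersion: 2.0
accessKeyId: AKIAIOSFODNN7EXAMPLE
bucket: my-nas-backup
deviceId: WD1234567890
eraseDevice: no
serviceLevel: standard
returnAddress:
    name: Aseem Kishore
    street1: 123 Example Street
    city: Anytown
    stateOrProvince: TX
    postalCode: 75001
    phoneNumber: 555-555-1212
    country: USA
```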
Okay, so the next thing we’re going to do after we fill out the manifest file is save it into the appropriate directory. To do that, we’re going to go ahead and click File, Save As, and we’re going to move back up into the import export Web Services Tool directory. This is also the location of that dot property file that we filled out earlier. Here you are going to go ahead and name your file “my import manifest.txt”. Since your Save As Type is already txt, you don’t have to type that into the file name. Go ahead and click Save.
Now that we’ve edited the AWS Credentials file and created the My Import Manifest file, we can go ahead and create a bucket in Amazon S3. This is very simple to do. What you are going to go ahead and do is go to aws.amazon.com, and you’re going to go ahead and click on My Account/Console and then click on AWS Management Console. Once you log in, you should get a screen that looks like this with all the different Amazon Web Services. At this point, all we care about is Amazon S3, which is down here at the bottom left. Click on that, and it’s going to go ahead and load up the S3 console. And as you can see here, there’s not really much to it other than buckets. So I have two buckets; this is my backup of my Synology NAS, which is a network type storage device.
What you’ll want to go ahead and do is click Create Bucket, and then you’re going to go ahead and give your bucket a bucket name. You can also choose a different region, but I suggest you just go with the region that it populates for you automatically. The bucket name can contain dots, and it has to be unique in the entire region where it’s being stored. So if somebody else already has that bucket name, it’s going to give you an error. For example, if I say nasbackup and click create, it’s going to give me an error that the requested bucket name is not available. In that case you can use dots, so you can put “dot” and whatever else you want and click create, and if that’s unique, it goes ahead and creates that bucket name. So you can go ahead and create a bucket, which is where all the data on the external hard drive is going to be stored.
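If you’d rather check a candidate name before clicking Create, the basic constraints can be sketched in a few lines. This is my own summary of S3’s DNS-compliant naming rules (3 to 63 characters; lowercase letters, digits, dots, and hyphens; starting and ending with a letter or digit), not code from Amazon, and it cannot tell you whether a name is already taken:

```python
import re

def is_valid_bucket_name(name: str) -> bool:
    """Rough local check of S3's DNS-compliant bucket naming rules."""
    if not 3 <= len(name) <= 63:
        return False
    # lowercase letters, digits, dots, hyphens; letter/digit at both ends
    if not re.fullmatch(r"[a-z0-9][a-z0-9.-]*[a-z0-9]", name):
        return False
    if ".." in name:  # no consecutive dots
        return False
    return True
```

A name can pass this check and still be rejected because someone else owns it, which is exactly the nasbackup error above; the dots trick works because they let you build a longer, more distinctive name.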
At this point, you might be wondering what else has to be done. So let’s take a look at what we have done so far. We signed up for the AWS service. We’ve downloaded and extracted the tool. We’ve edited the credentials file and entered our keys. We’ve gone ahead and created the manifest file and saved it as the import manifest in the same directory as the credentials file, and we created a bucket on Amazon S3. So there’s only a couple more things to do to get this done.
The next thing we have to do is create a job request using a Java command line tool. This is a bit technical, and this is probably the most technical thing that you are going to have to do, but it’s really not that hard. Now in order to create this job request, we have to run a Java command at the command prompt. But in order to do that, we have to have the Java Development Kit installed. This is different from the Java runtime environment, which is normally installed on most computers but won’t let you run Java commands at the command prompt.
In order to do that, what you’ll do is go to Google and just do a search for Java SE, which is Java Standard Edition. Go ahead and click on the first link here, and this brings you to this page. Here you can scroll down, and you’ll see three options: JDK, Server JRE, and JRE. We don’t need to worry about the last two here. We’re going to go ahead and download the JDK. On the next page, go ahead and click Accept License Agreement, and then you can download the file that matches your system specifications. In my case, I downloaded the Windows 64 bit executable file.
Now that you’ve installed the Java Development Kit, we can go ahead and run the Java command, and you can see this command in the documentation that I have highlighted here. And by the way, if you need to get to this documentation, the easiest way is to go to Google and do a search for “AWS import export docs”. Then go ahead and click on create your import job, and then click on create your first Amazon S3 import job, and you’ll be brought to this page.
Now we can go ahead and run the command by going to the command prompt. In order to do that, we click on Start, type in CMD and press Enter. Now that we have a command prompt, we need to go into the directory where the Amazon import export tool is located. In our case, it’s in Downloads, and then there’s a folder called Import Export Web Service Tool. So in order to navigate directories in the command prompt, you type in “cd”, and then I’m going to type in “downloads”, and then I’m going to type in “cd” again, and I’m going to type in “import export web service tool,” which is the name of the directory. Now that I’m in that directory, I’m simply going to go ahead and copy this command and paste it into the command prompt.
You may have noticed that in the command we just copied and pasted, the name of the manifest file is My S3 Import Manifest.txt. I think this is a problem with the documentation, because when I tried to run it this way, I got an error saying that the file had to be named My Import Manifest.txt. So simply move your cursor and delete the S3 part, and you should be able to run the command. Now I’m not going to go ahead and run the command right now because I’ve already run it before. But when you go ahead and press Enter, you should get something like this: job created, job ID, the AWS shipping address, and the Signature File Contents.
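For reference, the command we’re pasting is assembled roughly like the sketch below. The jar filename here is an assumption based on the tool download (check the lib folder of the version you extracted), and the final “.” argument is the directory holding the manifest and credentials files:

```python
# Sketch of the CreateJob invocation, run from the tool's directory.
# The jar name below is an assumption; verify it against your download.
manifest = "MyImportManifest.txt"   # note: the "S3" removed, per the fix above
cmd = ["java", "-jar", "lib/AWSImportExportWebServiceTool-1.0.jar",
       "CreateJob", "Import", manifest, "."]
print(" ".join(cmd))
```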
The signature file contents refers to a file called Signature that’s created in the root directory of the Import Export Web Services tool. This will be created when you run the actual command. If everything goes okay, you can then take this file, and you’re going to have to copy it onto the root of your hard drive.
We’re almost to the end here. The next thing we have to do is copy the Signature file to the root of the hard drive. We can find the file called Signature in the Import Export Web Services Tool directory after running the Java command.
The second to the last step is printing out the packing slip and filling it out. This is what the packing slip looks like. It’s a very simple document. You go ahead and put in the date, your e-mail account ID, your contact number, your name and phone number, the job ID, and the identifier that you put in for your device. Again, you can find this document in the documentation.
And finally, the last step is to simply pack your hard drive and ship it to Amazon. There are a few little things that you have to take note of. Firstly, you need to include the power supply and any power cables and any interface cables, so if it’s USB 2.0, 3.0, or eSATA, you need to include the USB cable or eSATA cable. If not, they’ll go ahead and return it back to you. You’ll also have to fill out that packing slip that I mentioned earlier and put that inside the box. And lastly, you’re going to send the package to the address that you got back from the CreateJob command that we ran.
There are two other small things to note when you are shipping. Firstly, you’re going to make sure that the shipping label has that job ID on there. If not, they’re going to return it back. So you need to make sure you have the job ID on the shipping label. Secondly, you should also fill out a return shipping address. This is going to be different than the return shipping address that we put in the manifest file. If they do not process your hard drive for some reason, if there’s a problem or something like that, they will return the hard drive to the shipping address on the shipping label. If they process your hard drive and they are able to transfer all of the data, they’ll return the hard drive to the return shipping address that you put in the manifest. So it’s important to put a return shipping address on the label too. You can choose whatever carrier you’d like. I chose UPS. It’s good to have the tracking number, and they can go ahead and do all of this for you without a problem.
And that’s about it. It is a few steps, and it does take a little bit of time the first time you do it. But after that, it’s pretty quick, and it’s a great way to save a lot of data to the Cloud, and Amazon is also cheap for storage. So if you have a ton of data you need to store, and you want to back it up somewhere other than in your house or on your external hard drive, then Amazon Web Services S3 is a great option.
I hope you enjoyed this tutorial from Online Tech Tips. Please come back and visit.