12/12/2020
aws · 9/1/2020
S3 is a serverless cloud object storage service provided by AWS. That means you can store any files in S3 without maintaining a backend machine, which is AWS's responsibility. It is also very cheap; you can check the pricing for each region you'll be using for S3. For example, as of 2020, when I check US East (Ohio), the S3 Standard price for the first 50 TB/month is $0.023 per GB, and the price also varies with how frequently you access the files. If you just want to store some files in S3 for a long time for archive purposes, you can use S3 Glacier Deep Archive, which is $0.00099 per GB.
So, what I'm doing in this blog is showing some sample code for uploading files to AWS S3 with Node.js.
Setup IAM User with S3 permission
The first thing that needs to be done before writing the Node.js code is to set up a user with S3 permissions to access your S3 bucket. I assume you already have an AWS account set up.
AWS, S3, Node.js · 8/22/2020
Here are a few commands if you want to get a list of git commit information.
This will list all the commits by the contributor.
git log --author="Contributor's name"
Get a commit-count summary per contributor since 19 April 2020.
git shortlog -s -n --since="19 April 2020"
Get commits by the contributor since 19 April 2020.
git log --author="Your contributor" --oneline --since="19 April 2020"
The same commit log, but pretty-printed.
git log --pretty="%C(Yellow)%h %C(reset)%ad (%C(Green)%cr%C(reset))%x09 %C(Cyan)%an: %C(reset)%s" --author="contributor" --since="19 May 2020 12:00:00 AM"
Quit the log view (the pager) by typing :q.
Git · 8/5/2020
A quick note to show how to squash all git local commits into one.
A scenario would be: you've checked in lots of commits to your own local branch, and when you're ready to push them to the origin branch you don't want to push that many commits (depending on the developer, you might have 100 commits).
Here is a quick note for doing that in the command line.
First, make sure you are in the correct branch.
git checkout YourBranch
Then, use the soft reset to the origin branch.
git reset --soft origin/YourBranch
Now, at this moment, all your commits are combined into one set of staged changes; you just need to check them in as a single commit.
If you use Visual Studio 2019, it already has a feature to squash all local commits into one; however, if you still use Visual Studio 2015, you will have to do it from the command line.
Git · 7/30/2020
This blog shows how to back up your AWS DynamoDB table (on the cloud) to your local DynamoDB environment. It can help when you want to build a prototype using production data.
Export table to S3
As the first step, log in to the AWS console's DynamoDB page, choose the table, then choose Export to S3. At the moment I'm writing this, you'll need to enable point-in-time recovery before you can export to S3. It says this could incur charges, but once the data is moved to S3 you can turn it back off.
Download data zip file
Once DynamoDB exports the table to S3, you should be able to find your data under {your bucket}/AWSDynamoDB/{unique guid}/data; download the zip file to your machine.
Following is the JSON file I downloaded from the S3 bucket. Notice that there is no comma after each item object. I'm not sure why, but you can simply fix it by adding a comma after each item object.
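One way to script that fix is below: a small sketch that assumes the exported data file contains one JSON object per line (the function name is mine, not from the export tooling) and builds a valid JSON array by parsing each line, rather than doing a raw text replacement.

```javascript
// The exported data file has one standalone JSON object per line, e.g.
// {"Item":{"id":{"S":"1"}}}, with no comma between lines. Parse each
// non-empty line, then re-serialize the whole thing as a JSON array.
function exportLinesToJsonArray(fileContents) {
  const items = fileContents
    .split('\n')
    .filter((line) => line.trim().length > 0)
    .map((line) => JSON.parse(line));
  return JSON.stringify(items);
}
```

Running the exported file's contents through this function yields a parseable JSON array, which is handy for loading the data into a local script.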
AWS, DynamoDB · 7/22/2020
This blog shows how to move a local DynamoDB table and its data to the DynamoDB web service. Unlike a SQL database, where you can use a backup file to restore to a different environment, DynamoDB offers no such option here; at the moment of writing I have not found any other way to move local DynamoDB data to the AWS web service. So the following sample code just loops over each item and uses aws-sdk to put it into the AWS DynamoDB web service.
DynamoDB table schema
Following is the sample table schema. You can add the table manually in the AWS DynamoDB console or use the SDK to create the table programmatically.
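The original post's code isn't reproduced in this excerpt, so the following is a sketch of the copy loop described above, with a made-up table name; it assumes the aws-sdk v2 package (`npm install aws-sdk`), a local DynamoDB listening on port 8000, and credentials configured for the web service.

```javascript
// Sketch of the copy loop: scan every item out of the local DynamoDB
// table, then put each one into the same-named table on the AWS web
// service. The table name argument and the local endpoint are
// placeholder assumptions.
async function copyTableToWebService(tableName) {
  const AWS = require('aws-sdk');
  const local = new AWS.DynamoDB.DocumentClient({
    endpoint: 'http://localhost:8000',
    region: 'us-east-1',
  });
  const remote = new AWS.DynamoDB.DocumentClient({ region: 'us-east-1' });

  let lastKey;
  let copied = 0;
  do {
    // scan() returns at most 1 MB per call, so page with ExclusiveStartKey.
    const page = await local
      .scan({ TableName: tableName, ExclusiveStartKey: lastKey })
      .promise();
    for (const item of page.Items) {
      await remote.put({ TableName: tableName, Item: item }).promise();
      copied += 1;
    }
    lastKey = page.LastEvaluatedKey;
  } while (lastKey);
  return copied;
}
```

For large tables, `batchWrite` would cut down the number of requests, but a plain per-item `put` keeps the sketch simple.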
AWS, DynamoDB · 7/6/2020
AWS DynamoDB is one of AWS's services, a NoSQL database on the cloud. You can create an AWS account, then go to the DynamoDB section of the AWS console to create a table and add items. With a SQL database, you or someone else needs to set up a server for the database, and you then access the database on that server with SQL queries; on the other hand, AWS manages the servers for this serverless NoSQL DynamoDB database, and you access it through HTTP requests.
What about the local environment? On the SQL side, you can set up a SQL database locally; how about DynamoDB? You can reference this blog for how to set up DynamoDB in your local environment. This blog will show how to create a DynamoDB table in your local environment.
The following example code uses Node.js running on macOS; the npm version is 6.12.0 and the node version is v12.13.0.
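As a sketch of what that local table creation can look like: the table name, key names, and local endpoint port below are assumptions, and the aws-sdk v2 package (`npm install aws-sdk`) is required for the actual call.

```javascript
// Hypothetical createTable parameters: a table with a single string
// partition key. DynamoDB Local still requires ProvisionedThroughput
// values even though it doesn't enforce them.
const localTableParams = {
  TableName: 'Notes',
  KeySchema: [{ AttributeName: 'id', KeyType: 'HASH' }], // partition key
  AttributeDefinitions: [{ AttributeName: 'id', AttributeType: 'S' }],
  ProvisionedThroughput: { ReadCapacityUnits: 1, WriteCapacityUnits: 1 },
};

// Create the table against the local instance, assumed to be listening
// on http://localhost:8000. DynamoDB Local accepts any credentials, but
// the SDK still needs a region configured.
async function createLocalTable(params) {
  const AWS = require('aws-sdk');
  const dynamodb = new AWS.DynamoDB({
    endpoint: 'http://localhost:8000',
    region: 'us-east-1',
  });
  const result = await dynamodb.createTable(params).promise();
  return result.TableDescription.TableName;
}
```

Calling `createLocalTable(localTableParams)` resolves with the created table's name once DynamoDB Local accepts the request.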
AWS, DynamoDB · 6/10/2020
AWS DynamoDB is a document database that provides single-digit-millisecond performance. It is a serverless database, which means you can focus on the applications that access DynamoDB while AWS manages the servers for you. It's also available in multiple AWS regions and has a built-in cache, DAX.
Another good reason to use DynamoDB is its free tier: up to 25 GB of storage, 25 write capacity units, and 25 read capacity units are free. Compare that to MongoDB, where, depending on the cloud provider, the free tier may only include around 500 MB of storage.
Now, in this blog I'll show how to set up DynamoDB locally, along with a few simple code samples that access the local DynamoDB.
Download DynamoDB
Go to this link from AWS to download DynamoDB. Different regions have different DynamoDB download files, so choose the region you'll be using for DynamoDB. After unzipping the file, run the following command.
AWS, DynamoDB · 6/6/2020
A quick code snippet showing how to use enum flags by using powers of two. Define your enum and set the values as follows; this makes each combined flag value unique.
[Flags]
public enum MyType : uint {
Undefined = 0,
Type1 = 1,
Type2 = 2,
Type3 = 4,
Type4 = 8,
Type5 = 16
}
Or you can use bit shifts if you don't want to work out the powers yourself.
[Flags]
public enum MyType2 : uint {
Undefined = 0,
Type1 = 1,
Type2 = 2 << 0,
Type3 = 2 << 1,
Type4 = 2 << 2,
Type5 = 2 << 3
}
Following are a few ways to use this enum.
// set value
var myvalue = MyType.Type1 | MyType.Type2;
Console.WriteLine(myvalue);
var myvalue2 = MyType2.Type1 | MyType2.Type2;
Console.WriteLine(myvalue2);
// check value
if ((myvalue2 & MyType2.Type1) == MyType2.Type1) {
Console.WriteLine("Type1 flag is set");
}
csharp · 6/1/2020
This quick note shows how to increase an AWS EC2 Linux instance's EBS volume size.
First, log in to AWS, select the target EBS volume, and update the size. The volume size updates right away, but you'll still need to do the following.
Next, ssh into the EC2 Linux instance to check the current sizes. Type the following command to see whether the increased volume size has taken effect.
lsblk
Following is a sample result when you type lsblk. Here we can see the volume is now 16G, but the partition on it is still only 8G.
nvme1n1 30G
nvme0n1 16G
|-----nvme0n1p1 8G
Now, the following command will grow the partition to use the maximum available size, based on the sample result above. After you run it, your partition should be 16G.
AWS