Wednesday, October 30, 2013

Script to copy content from server or local storage to S3


The following script can be used to copy data from a Linux server to an S3 bucket:

#!/bin/bash
## Script to copy data from stage to S3 bucket

s3cmd sync /tmp/code-backup/ s3://s3-bucket-name/backup/ >> /var/log/daily-backup.log


s3cmd : the command-line tool used to talk to S3
sync : uploads only files that are new or have changed between the server directory and the S3 bucket
/var/log/daily-backup.log : file where the script's output is logged
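
Since this is typically run as a daily job, here is a minimal sketch of a cron entry that runs the script every night at 2 AM (the script path /usr/local/bin/s3-backup.sh and the cron file location /etc/cron.d/s3-backup are assumptions for illustration):

# /etc/cron.d/s3-backup (hypothetical file)
# minute hour day-of-month month day-of-week user command
0 2 * * * root /usr/local/bin/s3-backup.sh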

If you want to keep backups for only a specific number of days on the server and in S3, the following script can be used; here 90 days is taken into consideration:
#!/bin/bash
## Script to copy data from stage to S3 bucket, keeping only the last 90 days

# Delete local backup files older than 90 days
find /tmp/code-backup/ -type f -mtime +90 -exec rm -f {} \;
# Sync to S3; --delete-removed also removes from the bucket files that no longer exist locally
s3cmd sync --delete-removed /tmp/code-backup/ s3://s3-bucket-name/backup/ >> /var/log/daily-backup.log
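
Before relying on --delete-removed in production, the deletions can be previewed with s3cmd's dry-run option, which only reports what would be uploaded or removed without changing anything:

s3cmd sync --dry-run --delete-removed /tmp/code-backup/ s3://s3-bucket-name/backup/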


If a daily backup of only newly created or modified files is needed, the following script can be used:

#!/bin/bash
## Script to copy the last day's new files from stage to S3 bucket

# Upload each file modified in the last 24 hours; using -exec instead of
# backticks handles filenames with spaces and the case where no files match
find /tmp/temp-backups/ -type f -mtime -1 -exec s3cmd put {} s3://s3-bucket-name/backup/ \; >> /var/log/daily-backup.log
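
For completeness, restoring is the same sync run in the opposite direction; here /tmp/restore/ is an assumed destination directory:

#!/bin/bash
## Restore the backup from the S3 bucket back to the server
s3cmd sync s3://s3-bucket-name/backup/ /tmp/restore/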


How to proxy pass in Nginx

The following configuration can be used in Nginx to proxy pass requests from one URL to another:

location /abc {
    rewrite /abc(.*) /$1 break;
    proxy_pass http://redirect.com;
    proxy_redirect off;
    proxy_set_header Host $host;
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
}

/abc : location prefix you want to proxy pass
/abc(.*) : matches both /abc and /abc/, capturing whatever follows the prefix
$1 : forwards only the captured remainder, so the /abc prefix is not appended to redirect.com (e.g. /abc/foo is proxied as /foo)
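
A quick way to confirm the rewrite behaves as expected is to request a path under /abc and watch where it lands; the hostname your-server below is a placeholder:

# Request through the proxy; Nginx rewrites /abc/foo to /foo
# and forwards it to http://redirect.com/foo
curl -i http://your-server/abc/foo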

Setup fully configurable EFK Elasticsearch Fluentd Kibana setup in Kubernetes

In the following setup, we will be creating a fully configurable Elasticsearch, Fluentd, Kibana setup, better known as an EFK setup. There is a...