JFrog Artifactory backup automation

Recently our team faced an interesting problem: how to automate JFrog Artifactory backups, given that the team is growing and our storage may soon exceed 1 TB.

Our Artifactory is deployed using Helm charts on OKE clusters (Oracle Cloud), with PostgreSQL serving as the backend for metadata and OCI S3 buckets serving as the storage for repository binaries (content).
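For context, the deployment looks roughly like the sketch below; the release name and namespace are assumptions, and the values.yaml carrying the PostgreSQL and object storage configuration is environment-specific and not shown here.

```bash
# JFrog's public chart repository; chart values for the external PostgreSQL
# backend and the OCI object storage binary store live in values.yaml (not shown)
helm repo add jfrog https://charts.jfrog.io
helm repo update
helm upgrade --install artifactory jfrog/artifactory \
    --namespace artifactory --create-namespace \
    -f values.yaml
```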

As of today, Oracle Cloud supports only one data center in KSA, and there is also a data residency requirement. Hence a backup strategy for our critical applications has become mandatory.

We brainstormed a couple of strategies and went through the best-practices document shared by JFrog, but it did not quite meet our requirements.

So we focused on doing two things:

1. Backing up the repository content (binaries) stored in the OCI S3 buckets.
2. Backing up the Artifactory metadata.

Point #1 was achieved by defining replication policies on the OCI S3 buckets, so that every object written to the primary bucket is also present in a second bucket (more on this at the end).

Now let us see how to automate the backup of the metadata.

Artifactory provides a REST API that exports the system settings. By tweaking the settings.json payload we can export only the metadata and thereby exclude the content.
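A minimal sketch of that call; the URL, credentials, and export path are placeholders, and the field names are based on the standard export-settings payload, so verify them against your Artifactory version.

```bash
# Trigger a system export that skips repository content
curl -u "${ART_USER}:${ART_PASSWORD}" \
     -H "Content-Type: application/json" \
     -X POST "http://artifactory.example.local:8081/artifactory/api/export/system" \
     -d '{
           "exportPath": "/opt/jfrog/artifactory/var/backup/metadata",
           "excludeContent": true,
           "createArchive": true
         }'
```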

The challenging part is that the exported metadata zip gets stored in the container's filesystem.

One option is to mount an OCI S3 bucket on the Artifactory pod as a volume using s3fs and provide the mountPath as the exportPath.

s3fs allows Linux, macOS, and FreeBSD to mount an S3 bucket via FUSE.
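For reference, such a mount would look roughly like this; the bucket name, namespace, and region are placeholders, and the credentials file must hold an OCI customer secret key.

```bash
# Mount the bucket through OCI's S3-compatible endpoint (names are placeholders)
echo "ACCESS_KEY_ID:SECRET_ACCESS_KEY" > "${HOME}/.passwd-s3fs" && chmod 600 "${HOME}/.passwd-s3fs"
s3fs artifactory-export /mnt/artifactory-export \
    -o passwd_file="${HOME}/.passwd-s3fs" \
    -o url=https://<namespace>.compat.objectstorage.<region>.oraclecloud.com \
    -o use_path_request_style
```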

But we felt this may not be the best solution for our needs: using S3 as a filesystem introduces network delays, and S3 is not designed for atomic operations.

The better solution is a K8s CronJob that invokes a bash script which takes care of exporting the metadata, copying it out of the Artifactory pod, and finally using OCI CLI commands to put the metadata objects on the S3 bucket.

Let's explore the logistics. To build this solution we need an environment: a Docker image with the necessary packages and utilities installed, namely the OCI CLI, kubectl, and a few shell utilities.
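A sketch of such an image; the base image, versions, and script name are assumptions.

```dockerfile
# Hypothetical CLI image: adjust base image and versions to your standards
FROM python:3.11-slim

# OCI CLI is published on PyPI as oci-cli
RUN pip install --no-cache-dir oci-cli

# kubectl plus a few shell utilities used by the backup script
RUN apt-get update && apt-get install -y --no-install-recommends curl zip \
    && curl -LO "https://dl.k8s.io/release/v1.28.0/bin/linux/amd64/kubectl" \
    && install -m 0755 kubectl /usr/local/bin/kubectl \
    && rm kubectl \
    && rm -rf /var/lib/apt/lists/*

# the backup script shown later in this post
COPY backup-metadata.sh /usr/local/bin/backup-metadata.sh
RUN chmod +x /usr/local/bin/backup-metadata.sh
```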

Build this as a CLI container image and reference it from a K8s CronJob:
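A minimal CronJob sketch, assuming the image has been pushed to OCIR, the admin credentials live in a secret named artifactory-admin, and the service account has RBAC permissions to list and copy from the Artifactory pod; all names are placeholders.

```yaml
apiVersion: batch/v1
kind: CronJob
metadata:
  name: artifactory-metadata-backup
  namespace: artifactory
spec:
  schedule: "0 0 * * *"          # daily at midnight
  concurrencyPolicy: Forbid
  jobTemplate:
    spec:
      template:
        spec:
          serviceAccountName: artifactory-backup   # needs get/list pods and pods/exec
          restartPolicy: OnFailure
          containers:
            - name: backup
              image: <region>.ocir.io/<tenancy>/backup-cli:latest
              command: ["/usr/local/bin/backup-metadata.sh"]
              env:
                - name: ART_USER
                  valueFrom:
                    secretKeyRef: {name: artifactory-admin, key: user}
                - name: ART_PASSWORD
                  valueFrom:
                    secretKeyRef: {name: artifactory-admin, key: password}
              volumeMounts:
                - name: oci-config
                  mountPath: /root/.oci
                  readOnly: true
          volumes:
            - name: oci-config
              secret:
                secretName: oci-cli-config
```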

The metadata backup bash script:
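A sketch under a few assumptions: the namespace, label selector, Artifactory service URL, and bucket name are placeholders, and the export path must be writable inside the Artifactory pod.

```bash
#!/usr/bin/env bash
# backup-metadata.sh - export Artifactory metadata and push it to an OCI bucket
set -euo pipefail

NS="artifactory"
POD="$(kubectl -n "$NS" get pods -l app=artifactory -o jsonpath='{.items[0].metadata.name}')"
ART_URL="http://artifactory.${NS}.svc.cluster.local:8081/artifactory"
STAMP="$(date +%Y%m%d)"
EXPORT_DIR="/opt/jfrog/artifactory/var/backup/metadata-${STAMP}"

# 1. Trigger the metadata-only system export shown earlier (content excluded)
curl -sf -u "${ART_USER}:${ART_PASSWORD}" \
     -H "Content-Type: application/json" \
     -X POST "${ART_URL}/api/export/system" \
     -d "{\"exportPath\":\"${EXPORT_DIR}\",\"excludeContent\":true,\"createArchive\":false}"

# 2. Copy the export out of the Artifactory pod and zip it locally
kubectl -n "$NS" cp "${POD}:${EXPORT_DIR}" "/tmp/metadata-${STAMP}"
(cd /tmp && zip -qr "metadata-${STAMP}.zip" "metadata-${STAMP}")

# 3. Upload the archive to the backup bucket with the OCI CLI
oci os object put \
    --bucket-name artifactory-metadata-backup \
    --file "/tmp/metadata-${STAMP}.zip" \
    --name "metadata-${STAMP}.zip" \
    --force
```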

Create the OCI CLI config as secrets:
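For example, assuming a working OCI CLI setup on the machine running this command; the secret name must match the one mounted in the CronJob, and the key_file path inside the config must point to where the key ends up mounted (e.g. /root/.oci/oci_api_key.pem).

```bash
# Package an existing OCI CLI config and API key as a Kubernetes secret
kubectl -n artifactory create secret generic oci-cli-config \
    --from-file=config="$HOME/.oci/config" \
    --from-file=oci_api_key.pem="$HOME/.oci/oci_api_key.pem"
```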

This ensures the Artifactory metadata is backed up daily at midnight. For the content backup, replication policies defined on the S3 buckets ensure that the same objects are present in both buckets.

Hope you liked reading this article and found it to be useful.

~ Thanks

