DevOps on AWS
DevOps adoption has become a crucial factor for any organization that wants to deliver solutions faster while creating greater efficiencies between developers and operations teams.
As the market leader in cloud services, AWS constantly releases new services along with best practices and guides for implementing DevOps on AWS. As an AWS Advanced Consulting Partner, Skaylink has developed a DevOps method that combines AWS best practices with our own many years of experience in supporting customers on their cloud and DevOps journey.
Our many years of experience have produced a unique range of offerings:
- DevOps maturity check: Skaylink evaluates your current methods and practices, your teams' workflows, and the tools and solutions you use in order to analyze your current implementation and identify potential for improvement.
- Enablement workshops: We have developed and held countless workshops on AWS Services with curated content that is based on AWS best practices and Skaylink experience. You can choose from a list of ready-to-book workshops ranging from AWS basics to advanced topics like Infrastructure as Code or DevOps Tooling.
- Project delivery: Draw on our expertise to carry out your projects with DevOps methods. From Infrastructure as Code and continuous integration to continuous delivery/deployment, architecture design and implementation, we have many years of experience delivering end-to-end cloud-native projects.
- DevOps as a service: Our extension approach makes it possible for you to book experienced DevOps experts for your projects – with very simple contractual models.
- DevOps orchestration: With this approach, we embed our DevOps knowledge and experience in your team. In other words, we work together with you on your projects and provide DevOps methods, coaching and support. This is usually a long-term engagement, streamlined with an effective transformation in mind.
Selected projects and solutions
This list provides a quick overview of the topics on which we work (no customer details will be listed for data protection reasons).
Qualification of developers in the field of AWS Elastic Beanstalk
- Quick qualification of developers in Elastic Beanstalk
- Extended workbench to support DevOps topics for development teams
- Operation of multiple productive applications with hundreds of environments that run on Elastic Beanstalk
- Use of Infrastructure as Code (AWS CloudFormation) and extended customer-specific adaptations (.ebextensions)
- CI/CD for quick availability with native AWS Services
- Extended security
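As an illustration of the customer-specific adaptations mentioned above, an `.ebextensions` config file lets a team adjust environment options and run OS-level commands on Elastic Beanstalk instances. The following is a hedged sketch only; the option values and package are invented examples, not a customer configuration:

```yaml
# .ebextensions/01-options.config — illustrative Elastic Beanstalk customization
option_settings:
  aws:elasticbeanstalk:application:environment:
    APP_STAGE: production        # example environment variable
  aws:autoscaling:asg:
    MinSize: 2                   # example scaling bounds
    MaxSize: 6
commands:
  01_install_jq:
    command: yum install -y jq   # example OS-level package install
```

Files like this live in the application bundle's `.ebextensions` directory and are applied automatically on deployment.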
Container orchestration with Kubernetes
- Design, implementation and operation of EKS clusters for multiple tenants
- Fully automated setup of clusters with Infrastructure as Code (Terraform, CloudFormation/eksctl)
- Setup and configuration of CI/CD pipelines for application availability – a completely automated end-to-end process
- Creation of a customer-specific container orchestration concept with best practices and architecture blueprints
ISV DevOps journey
- Support for an ISV in upgrading existing solutions to cloud-native services
- DevOps Maturity Assessment and Enablement through workshops
- PoC to move an application to the cloud
- Application of serverless frameworks (serverless.com, SAM) and AWS cloud-native services (Lambda, API Gateway, S3, Secrets Manager, DynamoDB)
- Significant cost reduction through cloud-native adoption compared with conventional instance-based solutions
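To illustrate the serverless frameworks mentioned above, a minimal AWS SAM template wires a Lambda function to an API Gateway endpoint in a few lines. This is an illustrative sketch, not the ISV's actual stack; the resource name, handler and path are invented:

```yaml
# template.yaml — minimal AWS SAM sketch (illustrative example)
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31
Resources:
  ApiFunction:
    Type: AWS::Serverless::Function
    Properties:
      Handler: app.handler       # hypothetical handler module
      Runtime: python3.12
      Events:
        GetItems:
          Type: Api              # implicit API Gateway endpoint
          Properties:
            Path: /items
            Method: get
```

Deployed with `sam build` and `sam deploy`, this template provisions the function, the API Gateway route and the IAM role behind them.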
Serverless application for secure exchange of passwords (OTP)
- Development of a secure portal for joint use of passwords with AWS cloud-native services
- Serverless application with AWS Lambda, API Gateway, DynamoDB, S3, KMS
- Step 1:
The client connects to the application, which is a static S3 website behind CloudFront.
- Step 2:
The client securely sends the secret text to a Lambda function that is running behind an API gateway.
- Step 3:
The Lambda function generates a random key to encrypt the secret text. The encrypted secret is then saved in DynamoDB.
- Step 4:
The ID of the encrypted content is combined with the encryption key (KEY), and both are encrypted again with a KMS key.
The resulting encrypted token is sent back to the original client.
- Step 5:
The recipient of the secret link sends the encrypted data to the backend. The Lambda function uses the same KMS key to decrypt it and extract the secret ID and the KEY with which the secret was originally encrypted.
Note that the KEY is never saved in the backend, so even direct access to the DynamoDB table is useless.
The retrieved secret is decrypted and sent back to the client via HTTPS.
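The envelope-encryption scheme in the steps above can be sketched in a few lines of Python. This is a toy illustration under stated assumptions: XOR with a random pad stands in for real symmetric encryption, an in-memory dict stands in for the DynamoDB table, and `MASTER_PAD` stands in for the AWS KMS key. All names (`encrypt_secret`, `redeem_token`, `TABLE`) are hypothetical, not the actual implementation:

```python
# Sketch of the envelope-encryption flow from the steps above.
# XOR with a random pad stands in for real symmetric encryption,
# an in-memory dict stands in for DynamoDB, and MASTER_PAD stands
# in for the AWS KMS key. All names here are illustrative.
import secrets
import uuid

MASTER_PAD = secrets.token_bytes(4096)   # stand-in for the KMS key
TABLE: dict[str, bytes] = {}             # stand-in for the DynamoDB table

def xor(data: bytes, key: bytes) -> bytes:
    """Toy cipher: XOR data with a key at least as long as the data."""
    return bytes(d ^ k for d, k in zip(data, key))

def encrypt_secret(secret_text: str) -> bytes:
    """Steps 3-4: encrypt with a fresh random key, store the ciphertext,
    and return a token wrapping (secret ID, key) under the master key."""
    plaintext = secret_text.encode()
    data_key = secrets.token_bytes(len(plaintext))  # fresh random key
    secret_id = str(uuid.uuid4())
    TABLE[secret_id] = xor(plaintext, data_key)
    # The data key is never stored in the backend; it exists only
    # inside the token handed back to the client.
    return xor(secret_id.encode() + data_key, MASTER_PAD)

def redeem_token(token: bytes) -> str:
    """Step 5: unwrap the token, fetch and decrypt the ciphertext."""
    unwrapped = xor(token, MASTER_PAD)
    secret_id, data_key = unwrapped[:36].decode(), unwrapped[36:]
    ciphertext = TABLE.pop(secret_id)    # one-time read: delete after use
    return xor(ciphertext, data_key).decode()
```

The key property of the design survives even in this sketch: the backend table holds only ciphertext, and the key needed to decrypt it travels exclusively inside the token held by the clients.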
Do you have questions for our experts?
Are you unsure where your digital journey should take you?
Just fill out the contact form and we will be in touch with you shortly.