Key Tips for a Successful AWS Cloud Migration

AWS cloud migration process

Amazon Web Services (AWS) is a leading cloud computing provider, offering services for storing, processing, and analyzing data at scale. It provides scalable cloud environments on which businesses can deploy their workloads. Migration, however, is no easy job. This article presents Amazon’s basic framework for migration and describes the steps relevant to most AWS migration projects.

AWS migration consulting is part of the AWS Migration Program, which assists businesses in identifying and selecting APN Partners with proven technical expertise and a record of client success in specialized solution areas.

Migrate your application workloads

AWS is an excellent platform for Windows applications, both today and in the future; the company cites an average return of 444% for Windows workloads running on AWS. AWS has also supported SAP landscapes since 2011. The platform offers a wide range of instance types in the cloud for many different kinds of applications, and VMware has partnered with AWS to develop and deliver VMware Cloud on AWS for VMware-based workloads.

AWS Cloud Migration Phases

Amazon’s cloud migration plan covers five phases. Phase 1 is migration preparation and business planning: you create the business case for migrating to AWS, define your goals, and ask how your business processes could improve. As part of this phase you also determine a specific application to migrate to the cloud first.

AWS migration solutions

Our migration solutions address all aspects of process, technology, and finances to ensure that your projects achieve the desired results for the organization.

– Migration Methodology

Moving large volumes of data and many applications into the cloud requires a phased approach covering assessment, readiness and planning, migration, and operations, with each phase building on the previous one. AWS prescriptive guidance provides the methods and techniques for each step of your migration journey.

– AWS Managed Services

AWS Managed Services (AMS) provides enterprise-grade infrastructure operations that allow production workloads to be migrated within days. For compliance reasons, AMS applies only the required security updates to applications, and it takes charge of running the cloud environment on your behalf.

– AWS Migration Competency Partners

An AWS Migration Competency Partner can help you complete your migration faster. To earn Migration Competency status, global systems integrators and regional partners must demonstrate the successful completion of multiple large migrations to AWS.

– AWS Migration Acceleration Program

The AWS Migration Acceleration Program (MAP) aims to make an organization’s migration more efficient by providing a comprehensive migration methodology along with AWS investment that reduces the cost of moving to the cloud.

– AWS Training and Certification

AWS Training and Certification helps organizations build the knowledge and expertise needed for the cloud. Cloud adoption can be as much as 20% faster when you employ a highly skilled workforce.

Why should I migrate to AWS Cloud?

Many enterprises that have moved to Amazon Web Services report an improvement of around 36% in their IT infrastructure.

Faster time to business results

Automation and data-driven guidance simplify migration, reducing both time and complexity. A faster migration in turn shortens the time it takes to realize value from the cloud.

Migration to AWS: 5 challenges and solutions

Migration to AWS can be a complicated process with many challenges. Here are the most common problems and how to address them.

Plan for security

Challenge: A cloud environment is not secured the same way as an in-house environment, and its security characteristics are quite distinct. The risk is that your existing security tooling may no longer work once applications move from your on-premises environment to the cloud.

Solution: Identify the security requirements of the application you are moving and make sure it continues to meet the corresponding security standards after migration. Then find AWS-native equivalents for the security controls you rely on on-premises.

Moving On-Premise Data and Managing Storage on AWS

Challenge: How do you move your on-premises data to the cloud and manage storage on AWS?

Solution: AWS Direct Connect provides dedicated, highly reliable network connectivity between your premises and AWS, which can support enterprise applications during the transfer and keep data flows synchronized while giving you centralized visibility. Amazon CloudWatch can be used to limit the impact of the migration on users: it detects performance problems as they happen so they can be resolved before users are affected.
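As a rough illustration of that monitoring, here is a minimal TypeScript sketch, assuming the AWS SDK for JavaScript v3 and a hypothetical instance ID, that creates a CloudWatch alarm to flag performance problems on a migrated instance:

import { CloudWatchClient, PutMetricAlarmCommand } from '@aws-sdk/client-cloudwatch';

// Hypothetical example: alarm when average CPU on a migrated instance stays above 80%
const cloudwatch = new CloudWatchClient({ region: 'us-east-1' });

async function createMigrationCpuAlarm(instanceId: string) {
  await cloudwatch.send(
    new PutMetricAlarmCommand({
      AlarmName: `migration-cpu-high-${instanceId}`,
      Namespace: 'AWS/EC2',
      MetricName: 'CPUUtilization',
      Dimensions: [{ Name: 'InstanceId', Value: instanceId }],
      Statistic: 'Average',
      Period: 300, // evaluate in 5-minute windows
      EvaluationPeriods: 2, // two consecutive breaches trigger the alarm
      Threshold: 80,
      ComparisonOperator: 'GreaterThanThreshold',
    })
  );
}

createMigrationCpuAlarm('i-0123456789abcdef0').catch(console.error);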

Resilience for compute and network resources

Challenge:

Your application should remain highly available to users on AWS. Cloud instances are ephemeral, so a single instance cannot be relied on to run forever. A second requirement is reliable connectivity, ensuring that all the resources in the cloud stay reachable.

Solution:

For compute, you can select Reserved Instances, which help you keep capacity reserved for your machine instances. You can also replicate resources across Availability Zones or use services that manage deployment and availability for you, such as AWS Elastic Beanstalk.

How do I manage my costs?

Challenge: Many organizations move to a cloud environment without defining specific KPIs for what the cloud should cost, which makes it hard to answer whether the move was successful in an economic sense.

A cloud environment is also very dynamic: costs can change rapidly as you adopt new services or scale your application.

Solution: Before moving, create an objective business case and understand the expected value of migrating your workloads to the cloud.

Log Analysis and Metric Collection

Challenge: After migrating to AWS, your system can become far more dynamic and elastic. Your earlier approach to logging may no longer apply: centralizing log data becomes essential if you want to analyze log files from instances that were shut down yesterday.

Solution: Ensure log data is shipped to a central place so you have a single view across the whole system. Amazon CloudWatch Logs can serve as that central store, with services such as AWS Lambda forwarding or processing log events.
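A minimal sketch of shipping application log lines to a central CloudWatch Logs group, again assuming the AWS SDK for JavaScript v3; the log group and stream names are hypothetical and the log group is assumed to exist already:

import {
  CloudWatchLogsClient,
  CreateLogStreamCommand,
  PutLogEventsCommand,
} from '@aws-sdk/client-cloudwatch-logs';

const logs = new CloudWatchLogsClient({ region: 'us-east-1' });

const logGroupName = '/myapp/migrated-service'; // hypothetical, pre-created log group
const logStreamName = `instance-${Date.now()}`;

async function shipLogs(messages: string[]) {
  // Create a dedicated stream for this instance, then push the events
  await logs.send(new CreateLogStreamCommand({ logGroupName, logStreamName }));
  await logs.send(
    new PutLogEventsCommand({
      logGroupName,
      logStreamName,
      logEvents: messages.map((message) => ({ message, timestamp: Date.now() })),
    })
  );
}

shipLogs(['service started', 'migration smoke test passed']).catch(console.error);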

What are the three phases of AWS cloud migration?

AWS organizes large migrations into three phases: assess, mobilize, and migrate and modernize. Each phase builds upon the previous one. AWS prescriptive guidance covers the assess phase as well as the mobilize phase.

Migrate to typescript – the advanced guide

About a year ago I wrote a guide on how to migrate to typescript from javascript on node.js, and it got more than 7k views. I did not have much knowledge of javascript or typescript at the time, and I might have been focusing too much on certain tools instead of the big picture. The biggest problem is that I didn’t provide a solution for migrating large projects, where you are obviously not going to rewrite everything in a short time. Thus I feel the urge to share the latest and greatest of what I have learned about migrating to typescript.

The entire process of migrating your mighty thousand-file mono-repo project to typescript is easier than you think. Here are the 3 main steps to do it.

NOTE: This article assumes you know the basics of typescript and use Visual Studio Code; if not, some details might not apply.

Relevant code for this guide: https://github.com/llldar/migrate-to-typescript-the-advance-guide

Typing Begins

After 10 hours of debugging using console.log, you finally fixed that Cannot read property 'x' of undefined error, and it turns out it was caused by calling a method on something that might be undefined: what a surprise! You swear to yourself that you are going to migrate the entire project to typescript. But when looking at the lib, util and components folders and the tens of thousands of javascript files in them, you say to yourself: ‘Maybe later, maybe when I have time’. Of course that day never comes, since you always have “cool new features” to add to the app and customers are not going to pay more for typescript anyway.

Now what if I told you that you can migrate to typescript incrementally and start benefiting from it immediately?

Add the magic d.ts

d.ts files are typescript type declaration files. All they do is declare the various types of the objects and functions used in your code; they do not contain any actual logic.

Now suppose you are writing a messaging app.

Assume you have a constant named user and an array of users inside user.js:

const user = {
  id: 1234,
  firstname: 'Bruce',
  lastname: 'Wayne',
  status: 'online',
};

const users = [user];

const onlineUsers = users.filter((u) => u.status === 'online');

console.log(
  onlineUsers.map((ou) => `${ou.firstname} ${ou.lastname} is ${ou.status}`)
);

The corresponding user.d.ts would be:

export interface User {
  id: number;
  firstname: string;
  lastname: string;
  status: 'online' | 'offline';
}

Then you have this function named sendMessage inside message.js

function sendMessage(from, to, message)

The corresponding interface in message.d.ts should look like:

type sendMessage = (from: string, to: string, message: string) => boolean

However, our sendMessage might not be that simple: maybe we used some more complex types as parameters, or it could be an async function.

For complex types, you can use import to help out, keep the types clean, and avoid duplication:

import { User } from './models/user';
type Message = {
  content: string;
  createAt: Date;
  likes: number;
}
interface MessageResult {
  ok: boolean;
  statusCode: number;
  json: () => Promise<any>;
  text: () => Promise<string>;
}
type sendMessage = (from: User, to: User, message: Message) => Promise<MessageResult>

NOTE: I used both type and interface here to show you how to use them; you should stick to one of them in your project.

Connecting the types

Now that you have the types, how do they work with your js files?

There are generally 2 approaches:

JSDoc typedef import

Assuming user.d.ts is in the same folder, add the following comments to your user.js:

/**
 * @typedef {import('./user').User} User
 */

/**
 * @type {User}
 */
const user = {
  id: 1234,
  firstname: 'Bruce',
  lastname: 'Wayne',
  status: 'online',
};

/**
 * @type {User[]}
 */
const users = [];

// onlineUsers will automatically have its type inferred as User[]
const onlineUsers = users.filter((u) => u.status === 'online');

console.log(
  onlineUsers.map((ou) => `${ou.firstname} ${ou.lastname} is ${ou.status}`)
);

To use this approach correctly, you need to keep the import and export inside your d.ts files. Otherwise you would end up with the any type, which is definitely not what you want.

Triple slash directive

The triple slash directive is the “good ol’ way” of importing in typescript, for situations where you are not able to use import.

NOTE: you might need to add the following to your ESLint config file when dealing with triple slash directives, to avoid ESLint errors:

{
  "rules": {
    "spaced-comment": [
      "error",
      "always",
      {
        "line": {
          "markers": ["/"]
        }
      }
    ]
  }
}

For the message function, add the following to your message.js file, assuming message.js and message.d.ts are in the same folder:

/// <reference path="./models/user.d.ts" /> (add this only if you use user type)
/// <reference path="./message.d.ts" />

and then add a jsDoc comment above the sendMessage function:

/**
* @type {sendMessage}
*/
function sendMessage(from, to, message)

You will then find that sendMessage is correctly typed, and your IDE gives you autocompletion for from, to and message as well as for the function’s return type.

Alternatively, you can write it as follows:

/**
* @param {User} from
* @param {User} to
* @param {Message} message
* @returns {Promise<MessageResult>}
*/
function sendMessage(from, to, message)

This is more the conventional way of writing jsDoc function signatures, but it is definitely more verbose.

When using the triple slash directive, you should remove import and export from your d.ts files, otherwise the directive will not work. If you must reference a type from another file, use an inline import like this:

type sendMessage = (
  from: import("./models/user").User,
  to: import("./models/user").User,
  message: Message
) => Promise<MessageResult>;

The reason behind all this is that typescript treats d.ts files as ambient (global) declarations if they don’t have any imports or exports. If they do have an import or export, they are treated as normal module files rather than global ones, so using them in a triple slash directive or for augmenting module definitions will not work.
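To make the distinction concrete, here is a minimal sketch with two hypothetical declaration files:

// global.d.ts: no top-level import/export, so this declaration is ambient (global)
type SendSimpleMessage = (from: string, to: string, message: string) => boolean;

// message-module.d.ts: the top-level import turns this file into a module,
// so its types are only visible where they are explicitly imported
import { User } from './models/user';
export type SendUserMessage = (from: User, to: User, message: string) => boolean;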

NOTE: In your actual project, stick to either the import and export approach or the triple slash directive; do not use both.

Automatically generate d.ts

If you already have a lot of jsDoc comments in your javascript code, you are in luck. With one simple command:

npx tsc src/**/*.js --declaration --allowJs --emitDeclarationOnly --outDir types

Assuming all your js files are inside the src folder, the generated d.ts files will appear in the types folder.

Babel configuration (optional)

If you have Babel set up in your project, you might need to add this to your .babelrc:

{
  "exclude": ["**/*.d.ts"]
}

This avoids compiling the *.d.ts files into *.d.js files, which doesn’t make any sense.

Now you should be able to benefit from typescript (autocompletion) with zero configuration and zero logic change in your js code.

The type check

Once at least 70% of your code base is covered by the aforementioned steps, you might begin considering switching on the type check, which helps you further eliminate minor errors and bugs in your code base. Don’t worry, you are still going to use javascript for a while, which means no changes to your build process or libraries.

The main thing you need to do is add jsconfig.json to your project.

Basically, it’s a file that defines the scope of your project and the libs and tools you are going to work with.

Example jsconfig.json file:

{
  "compilerOptions": {
    "module": "commonjs",
    "target": "es5",
    "checkJs": true,
    "lib": ["es2015", "dom"],
    "baseUrl": "."
  },
  "include": ["src/**/*"],
  "exclude": ["node_modules"]
}

The main point here is that checkJs needs to be true; this enables type checking for all our js files.

Once it’s enabled, expect a large number of errors, and be sure to fix them one by one.

Incremental typecheck

// @ts-nocheck

If you have some js file you would rather fix later, you can add // @ts-nocheck at the top of the file and the typescript compiler will simply ignore it.

// @ts-ignore

What if you just want to ignore one line instead of the entire file? Use // @ts-ignore; it makes the compiler ignore the line directly below it.
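A quick sketch of both directives (the file names and the fix-later comment are hypothetical):

// legacy-report.js: skip type checking for this whole file for now
// @ts-nocheck
const report = require('./some-untyped-lib');

// user.js: suppress a single error while keeping the rest of the file checked
// @ts-ignore -- 'away' is not part of the User status union yet, fix later
user.status = 'away';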

These two tags combined should allow you to fix the type check errors in your codebase at a steady pace.

External libraries

Well maintained library

If you are using a popular library, chances are there are already typings for it at DefinitelyTyped. In this case, just run:

yarn add @types/your_lib_name --dev

or

npm i @types/your_lib_name --save-dev

NOTE: if you are installing type declarations for a scoped library whose name contains @ and /, like @babel/core, you should remove the @ and / and join the two parts with __ in the middle, resulting in something like babel__core.
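For example, the type declarations for @babel/core are installed with:

npm i @types/babel__core --save-dev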

Pure Js Library

What if you are using a js library that the author archived 10 years ago and that does not provide any typescript typings? It’s very likely to happen, since the majority of npm modules are still written in javascript. Adding @ts-ignore doesn’t seem like a good idea, since you want as much type safety as possible.

Now you need to augment the module’s definition by creating a d.ts file, preferably in the types folder, and adding your own type definitions to it. Then you can enjoy safe type checking for your code:

declare module 'some-js-lib' {
  export const sendMessage: (
    from: number,
    to: number,
    message: string
  ) => Promise<MessageResult>;
}

After all this you should have a pretty good way to type check your codebase and avoid minor bugs.

The type check rises

Once you have fixed more than 95% of the type check errors and are sure that every library has corresponding type definitions, you may proceed to the final move: officially changing your code base to typescript.

NOTE: I will not cover the details here since they were already covered in my earlier post

Change all files into .ts files

Now it’s time to merge the d.ts files with your js files. With almost all type check errors fixed and type coverage for all your modules, what you do is essentially change the require syntax to import and merge each js file and its d.ts file into a single ts file. The process should be rather easy with all the work you’ve done before.
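A small sketch of what that looks like for the earlier user example (the require line is hypothetical, just to show the syntax change):

// user.ts: the former user.js and user.d.ts merged into one file
// (before, user.js would have used: const { sendMessage } = require('./message');)
import { sendMessage } from './message';

export interface User {
  id: number;
  firstname: string;
  lastname: string;
  status: 'online' | 'offline';
}

const user: User = {
  id: 1234,
  firstname: 'Bruce',
  lastname: 'Wayne',
  status: 'online',
};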

Change jsconfig to tsconfig

Now you need a tsconfig.json instead of jsconfig.json

Example tsconfig.json

Frontend projects

{
  "compilerOptions": {
    "target": "es2015",
    "allowJs": false,
    "esModuleInterop": true,
    "allowSyntheticDefaultImports": true,
    "noImplicitThis": true,
    "strict": true,
    "forceConsistentCasingInFileNames": true,
    "module": "esnext",
    "moduleResolution": "node",
    "resolveJsonModule": true,
    "isolatedModules": true,
    "noEmit": true,
    "jsx": "preserve",
    "lib": ["es2020", "dom"],
    "skipLibCheck": true,
    "typeRoots": ["node_modules/@types", "src/types"],
    "baseUrl": ".",
  },
  "include": ["src"],
  "exclude": ["node_modules"]
}

Backend projects

{
  "compilerOptions": {
      "sourceMap": false,
      "esModuleInterop": true,
      "allowJs": false,
      "noImplicitAny": true,
      "skipLibCheck": true,
      "allowSyntheticDefaultImports": true,
      "preserveConstEnums": true,
      "strictNullChecks": true,
      "resolveJsonModule": true,
      "moduleResolution": "node",
      "lib": ["es2018"],
      "module": "commonjs",
      "target": "es2018",
      "baseUrl": ".",
      "paths": {
          "*": ["node_modules/*", "src/types/*"]
      },
      "typeRoots": ["node_modules/@types", "src/types"],
      "outDir": "./built",
  },
  "include": ["src/**/*"],
  "exclude": ["node_modules"]
}

Fix any additional type check errors after this change, since the type check has become even stricter.

Change CI/CD pipeline and build process

Your code now requires a build step to generate runnable code; usually adding this to your package.json is enough:

{
  "scripts":{
    "build": "tsc"
  }
}

However, for frontend projects you will often need Babel, and you would set up your project like this:

{
  "scripts": {
    "build": "rimraf dist && tsc --emitDeclarationOnly && babel src --out-dir dist --extensions .ts,.tsx && copyfiles package.json LICENSE.md README.md ./dist"
  }
}

Now make sure you change the entry points in your package.json like this:

{
  "main": "dist/index.js",
  "module": "dist/index.js",
  "types": "dist/index.d.ts"
}

Then you are all set.

NOTE: change dist to the folder you actually use.

The End

Congratulations, your codebase is now written in typescript and strictly type checked. Now you can enjoy all of typescript’s benefits: autocompletion, static typing, esnext syntax, and great scalability. DX goes sky high while the maintenance cost stays minimal. Working on the project is no longer a painful process, and you will never see that Cannot read property 'x' of undefined error again.

Alternative method:

If you want to migrate to typescript with a more “all in” approach, here’s a cool guide for that by the Airbnb team.