Step-by-step guide to onboarding new developers to Upwind Cloud

This document describes the general procedure we follow for onboarding new customers. It gives developers a good understanding of the steps involved, both on our side and on the customer's side.

Step 1: Configure access to cloud servers

Every customer is assigned a test and a production Linux server. The URLs to access these systems take the form cloudtest[Id].upwindtec.pt and cloud[Id].upwindtec.pt, where [Id] is an identifier assigned to each customer. Customers on the Business plan are assigned a unique Id; other customers can share the same Id.

For security reasons, customers do not have access to production systems; these systems are managed by Upwind Tech staff. Customers' applications and data are moved to the production system when the customer requests it.

Configuring customer access to the test system:

sudo adduser [customername]

The password is generated randomly and provided to the customer.

Create a database of the same name on both the test and production servers. The username and password to access the database are the same as for the Linux user. The database is hosted on a PostgreSQL server. Customers on the Business plan are assigned a dedicated PostgreSQL server; customers on shared plans share a single server hosting multiple databases.

Sample script:

docker exec -it postgres psql -U postgres

CREATE DATABASE [username]
    WITH OWNER = postgres
    ENCODING = 'UTF8'
    LC_COLLATE = 'en_US.utf8'
    LC_CTYPE = 'en_US.utf8'
    LOCALE_PROVIDER = 'libc'
    TABLESPACE = pg_default
    CONNECTION LIMIT = -1
    IS_TEMPLATE = False;
CREATE ROLE "[username]_user" WITH NOLOGIN NOSUPERUSER INHERIT NOCREATEDB NOCREATEROLE NOREPLICATION NOBYPASSRLS PASSWORD 'password';
CREATE ROLE "[username]_admin" WITH LOGIN NOSUPERUSER INHERIT NOCREATEDB NOCREATEROLE NOREPLICATION NOBYPASSRLS PASSWORD 'password';
GRANT ALL ON DATABASE [username] TO "[username]_admin";
GRANT CONNECT ON DATABASE [username] TO "[username]_user";
GRANT TEMPORARY, CONNECT ON DATABASE [username] TO PUBLIC;
GRANT ALL ON DATABASE [username] TO postgres;
\connect [username]
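After the script runs, the grants can be verified by connecting as the admin role from a client machine. A quick sketch (the host name, role, and database are the placeholders used above, so fill them in before running):

```shell
# Connect as the customer's admin role and print connection details.
# Replace the [Id] and [username] placeholders first.
psql -h cloudtest[Id].upwindtec.pt -p 5432 -U [username]_admin -d [username] -c '\conninfo'
```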

Step 2: Move existing data to PostgreSQL

The customer might have existing data stored in Firebase Realtime Database, Firestore, or other databases. The customer will typically provide a fake sample of the existing data for setting up the database and testing the solution before going into production.

The steps below assume that the data was stored in either Firebase Realtime Database or Firestore, but the procedure is similar for other database types.

  • Export the sample data to JSON format. Exporting from the Realtime Database is straightforward; exporting from Firestore is more elaborate. Upwind Tech will provide a script to help with the process.

Sample Node.js script:

const { initializeFirebaseApp } = require('firestore-export-import')
const { backups } = require('firestore-export-import')
const serviceAccount = require('./credentials.[customer].json')
const firestore = initializeFirebaseApp(serviceAccount)

const options = {
  docsFromEachCollection: 2, // limit the number of documents when exporting
  refs: [], // paths to export if not exporting the whole DB
}

backups(firestore, [], options).then((data) => {
  const fs = require('fs');
  fs.writeFile('[customer].json', JSON.stringify(data, null, 2), (err) => {
    if (err) {
      console.error('Error writing to file:', err);
    } else {
      console.log('File written successfully!');
    }
  });
});

  • Import the JSON data into Postgres. The import script is created and tested by the Upwind Tech team and is included in the setup fee.
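The actual import script is supplied by Upwind Tech. Purely as an illustration of the kind of transformation involved, here is a minimal Node.js sketch (the "Tenant" collection and its fields are invented sample data) that flattens one exported collection, keyed by document id, into rows suitable for parameterized INSERT statements:

```javascript
// Sketch: flatten one exported collection (an object keyed by document id)
// into flat row objects ready for parameterized INSERTs.
function collectionToRows(exported, collectionName, columns) {
  const collection = exported[collectionName] || {};
  return Object.entries(collection).map(([id, doc]) => {
    const row = { Id: id }; // the document id becomes the "Id" column
    for (const col of columns) {
      row[col] = doc[col] !== undefined ? doc[col] : null;
    }
    return row;
  });
}

// Invented sample mirroring the shape of a Firestore export.
const sample = {
  Tenant: {
    t1: { Salutation: 'Mr', Address: { AdminArea: 'Lisboa' } },
    t2: { Salutation: 'Ms' },
  },
};

console.log(collectionToRows(sample, 'Tenant', ['Salutation']));
// [ { Id: 't1', Salutation: 'Mr' }, { Id: 't2', Salutation: 'Ms' } ]
```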

Important considerations:

  • Decisions have to be made about which parts to keep as JSON and which parts to split into table columns or completely separate tables. For example, credit card information can be stored as a single encrypted JSON field, whereas any fields used for sorting or searching should be stored in separate columns. Short lists can be stored as a single JSON field, whereas long lists should be split into secondary tables linked to the primary one.
  • It is important to keep the capitalization of all elements so that there is a perfect match between the JSON data, the client application, the server application (web services), and the database. In Postgres, capitalization is preserved by surrounding all table and column names with double quotes.
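When SQL is generated programmatically, this double-quoting can be centralized in a small helper. A sketch (the function name is ours; the escaping rule, doubling embedded quotes, follows the SQL standard):

```javascript
// Quote a PostgreSQL identifier so its capitalization is preserved.
// Embedded double quotes are doubled, per the SQL standard.
function quoteIdent(name) {
  return '"' + name.replace(/"/g, '""') + '"';
}

console.log(quoteIdent('Tenant'));
// "Tenant"
console.log('SELECT ' + quoteIdent('Salutation') + ' FROM ' + quoteIdent('Tenant') + ';');
// SELECT "Salutation" FROM "Tenant";
```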

Sample script:

-- Import JSON data from Firebase into Postgres.
-- Keep all table and column names case-sensitive by using double quotes.
-- Note: JSON_TABLE requires PostgreSQL 17 or later.

CREATE TABLE lease_central(data jsonb);
INSERT INTO lease_central(data)
SELECT pg_read_file('c:/temp/migration_test.json')::jsonb;

CREATE TABLE "Tenant" AS
  SELECT get_key(pos, lease_central.data, 'Tenant') "Id", je.*
  FROM lease_central
  CROSS JOIN
  JSON_TABLE (
    lease_central.data,
    '$."Tenant".*' COLUMNS (
    pos FOR ORDINALITY,
    "Salutation" text PATH '$.Salutation',
    NESTED PATH '$.Address' COLUMNS (
        "AdminArea" text PATH '$.AdminArea'
        )
    )
  ) AS je;

ALTER TABLE "Tenant"
DROP COLUMN pos,
ADD CONSTRAINT tenant_pkey PRIMARY KEY ("Id");

Step 3: Build a skeleton web services application

The web services application can be created in any programming environment.

Note: We currently provide an API and samples supporting ASP.NET and C#, our preferred environment for this type of application, but other environments are also supported.

  • Using Visual Studio, create a new project based on the ASP.NET Core (C#) project template. Add the following NuGet packages to the project:
    Microsoft.EntityFrameworkCore.Design
    Microsoft.EntityFrameworkCore.Relational
    System.Linq.Dynamic.Core
    Npgsql.EntityFrameworkCore.PostgreSQL

The application will ultimately be built as a Docker container to be installed on the test and production servers. You can have Visual Studio create the Dockerfile, or use the simpler Dockerfile provided by Upwind Tech.

  • Create the C# models from the Postgres database, for example:

    dotnet ef dbcontext scaffold "host=cloudtest1.upwindtec.pt;port=5432;database=[customername];username=[customername]_admin;password=[customername]" Npgsql.EntityFrameworkCore.PostgreSQL --use-database-names --no-onconfiguring
    
  • [Optional:] Include UpwindtecCloudStorageUtils in the project dependencies, either as a library file or as a NuGet package (when available)

  • [Optional:] Make sure all entities implement the UpwindtecCloudStorageUtils.IBaseEntity interface, or create a common interface for all entities and have that interface derive from UpwindtecCloudStorageUtils.IBaseEntity, e.g.

    public partial class Tenant : UpwindtecCloudStorageUtils.IBaseEntity
    {
        public string Id { get; set; } = null!;
        public string? Salutation { get; set; }
    }

Step 4: Posting the web services application to the test server

Once development and testing are finished on the local development system, the web services application can be posted to the cloud test server for additional testing before going into production.

Install and run Docker Desktop on the development system, then run:

dotnet publish -c Release -o published
docker build -t customerwebservices .
docker save -o customerwebservices.tar customerwebservices:latest
scp .\customerwebservices.tar [customer]@cloudtest[Id].upwindtec.pt:/home/[customer]
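On the server side, the transferred image can then be loaded and started. A sketch, assuming the container listens on port 8080 (adjust the container name and port mapping to the actual application configuration):

```shell
# Run on the test server after the scp completes.
docker load -i customerwebservices.tar
docker run -d --name customerwebservices -p 8080:8080 customerwebservices:latest
```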