Cloud Security 101

Cloud security follows a fairly standard model across most cloud platforms. Many platforms let you deploy it easily through a wizard when setting up your virtual cloud network. If you are new to the Cloud, it helps to understand some security basics first.

Cloud Segregation

The most basic form of cloud security is data segregation. Data is secured by limiting exposure and restricting access through user roles and security policies. A standard network wizard will split the network into private and public subnets. Most components should sit within the private subnet and be restricted to a single access point. A plus point of the cloud is that much of the infrastructure security is handled by the cloud provider if you are using PaaS or SaaS.
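
To make the idea concrete, here is a minimal sketch of the same public/private split done in code instead of the wizard. It assumes an AWS account with the boto3 library configured; the CIDR blocks and the single bastion IP are illustrative only.

```python
import boto3

# Assumes AWS credentials are already configured; all values below are illustrative.
ec2 = boto3.client("ec2")

# One virtual network, split into a public and a private subnet.
vpc = ec2.create_vpc(CidrBlock="10.0.0.0/16")["Vpc"]["VpcId"]
public_subnet = ec2.create_subnet(VpcId=vpc, CidrBlock="10.0.1.0/24")["Subnet"]["SubnetId"]
private_subnet = ec2.create_subnet(VpcId=vpc, CidrBlock="10.0.2.0/24")["Subnet"]["SubnetId"]

# Security group for the private tier: only one access point (a bastion host) may reach it.
sg = ec2.create_security_group(
    GroupName="private-tier",
    Description="Private components, single access point only",
    VpcId=vpc,
)["GroupId"]

ec2.authorize_security_group_ingress(
    GroupId=sg,
    IpPermissions=[{
        "IpProtocol": "tcp",
        "FromPort": 22,
        "ToPort": 22,
        "IpRanges": [{"CidrIp": "203.0.113.10/32"}],  # hypothetical bastion IP
    }],
)
```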

Identity Security

If you are not using IaaS, the majority of your cloud security effort will be spent designing and maintaining identity security. Identity management is a common product offered by the major cloud providers. Its ease of integration and built-in single sign-on make it the preferred choice for developers.
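
As a rough sketch of what that integration looks like from the application side, the snippet below validates a single sign-on token and checks the caller's role. It assumes an OIDC-style identity provider and the PyJWT library; the audience value and the "roles" claim name vary by provider and are hypothetical here.

```python
import jwt  # PyJWT; assumes tokens are issued by an OIDC-style identity provider


def authorise(token: str, public_key: str, required_role: str) -> bool:
    """Validate a single sign-on token and check the caller's role.

    The audience and the 'roles' claim name are illustrative; real
    identity providers use their own values and claim names.
    """
    try:
        claims = jwt.decode(
            token,
            public_key,
            algorithms=["RS256"],
            audience="my-application",  # hypothetical audience
        )
    except jwt.InvalidTokenError:
        return False
    return required_role in claims.get("roles", [])
```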

Basic cloud security comes down to cloud segregation and identity security. These are the standard security models you will encounter if you are new to the Cloud. Security configuration is still not user friendly, and you still need some basic security knowledge to appreciate these approaches.

Another U-Turn of COVID-19

Finally, Singapore caved in to the reality checks of COVID-19 when measures were tightened again with work from home as the default. This is largely due to confusion over how the entire population should view COVID-19 as an endemic disease. The earlier messages created a false sense of security, and some organisations were quick to resume work in the office. These activities subsequently created clusters at workplaces, while rising cases strained medical resources. How should this be improved if we want to adopt an endemic approach?

Guidelines must be Clear on the Risks and Consequences

The typical decision makers in some organisations are quick to follow guidelines blindly. This is the traditional view, where organisations or the public trust the guidelines without properly evaluating their own risk management and staff profiles. There are added risks when staff have families with small kids. Of course, it is unfair to blame this on the guidelines. After all, guidelines are not rules; they are to be taken with a correct understanding of one's risk appetite. By now, we have seen that these guidelines should highlight the risks and consequences that organisations must accept if they choose to follow them to a tee.

COVID-19 cannot be treated like the Flu

By now, we know that COVID-19 cannot be treated like a common flu. It requires a new set of policies, guidelines, risk management and even communication to educate the general public. After all, we are still learning how to manage COVID-19 as an endemic disease. These hard lessons should be reflected upon and not treated lightly (pun intended)! It is recommended that organisations set up this new set of guidelines if they want to adopt an endemic view of COVID-19.

The past few weeks of treating COVID-19 as an endemic disease have created a state of confusion. This is largely due to the need for a new management approach to COVID-19 as an endemic. It is obvious that a clearly articulated communication plan must be thought out for the "blindsided". In the meantime, it is better for the majority to stay conservative, as safety comes first for kids and high-risk groups.

Cloud Assessment Approach

One of the most common cloud challenges is customisation of on-premise applications. Many assessments of existing on-premise systems will surface reasons why you should not move to the Cloud. However, that assessment approach is wrong. Instead, the assessment should be telling you how you need to move to the Cloud.

We find what we want to see

A good cloud assessment will not be one sided. It should present balanced data and strategy that give the reader the right material to make a decision. This prevents the bias of "finding what we want to see". It is common to reason that heavily customised applications should remain on premise instead of moving to the Cloud. Vice versa, findings often suggest that the Cloud should only handle lightly customised applications.

Strategy to Cloud

As products move to SaaS, users will be forced to move customised applications to the Cloud. It is important for the assessment to provide a strategy for migrating these customised applications. Many cloud assessments only use a big bang approach to review whether an application can move to the Cloud. You should also request that other strategies be included in your cloud assessment, such as a phased transition or a pilot migration.

Cloud assessments are often biased and skewed towards a big bang migration approach. You should consider other migration strategies when conducting your cloud assessment. This will give you a better picture for deciding whether you can migrate to the Cloud.

RAD with APEX

Rapid Application Development (RAD) has been around since the days of Visual Basic (VB). Since then, there has been a preference for lightweight, low-code frameworks paired with an Agile approach. APEX (Application Express) is one such framework, offered free with the Oracle database. What does this signal for heavyweights like Spring or J2EE? Should you switch to low code like APEX?

Ease of Maintenance

The key push factor for switching to a low-code framework like APEX is the ease of maintenance. It helps users focus on solving the business problem rather than building the technical solution. Changes can be made and deployed quickly. This suits highly complex and exploratory problems where the scope is largely unknown.

Business Driven Skillset

A low-code environment places emphasis on outcomes instead of expertise in the programming language. Thus, a platform like APEX suits users who are not technically inclined to set up an entire infrastructure of servers and databases. It emphasises business-driven skills like problem solving instead of technical skills. Users can also realise business solutions quickly with low-code deployment.

The switch to low-code platforms like APEX will keep growing. This is driven by the need for agile solutions to changing requirements. APEX also allows citizen developers to co-own and maintain the development.

Building a Stakeholder List

In many projects, the stakeholder list is the output of the stakeholder analysis activity. Not many are aware that you can actually anticipate the type of project outcome from the stakeholder list. These are the red flags to look out for when reviewing a stakeholder list.

Get the Right Stakeholder

The creation of a stakeholder list is important because you can review the role of each stakeholder. It is common to see names that are merely participants and not contributors. The right stakeholder must contribute and be accountable. If they do not fulfil these objectives, it is worthwhile to reconsider those names.

Champions and Sponsors

Project veterans will soon realise the importance of champions and sponsors. These two roles are critical success factors for all project implementations, especially digital transformations. Many of the enjoyable, successful projects I worked on had such players. These stakeholders make a great difference to a project. They provide morale, passion and clear direction.

In many projects, there are moments where stakeholders are inserted for political or namesake reasons. You should remove these stakeholders and ensure the right ones are there to contribute and be accountable. Champions and sponsors are necessary to provide the right support and ensure your project is driven in the right direction.

Software Transition Phase for Cloud

During a digital transformation, a transition phase from legacy to a cloud platform will be necessary when you have a complex web of tightly coupled applications. These are the symptoms that tell you to consider transition steps before moving to your final to-be state.

Data Flow Diagnostic

A data flow diagnostic will reveal a history of changes that have created data inconsistencies across integrated applications. You need to do a data flow analysis to understand the data sources and outputs. It is important to note that there will be existing discrepancies in how data flows from your application to others. This is a key sign that you must correct these flows before migrating to the Cloud. If you migrate the same data flows as-is, you will inevitably have to customise your cloud architecture to handle the legacy data flows.
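
As a rough illustration (a sketch only; the key field and the two extracts are hypothetical), a data flow diagnostic can be as simple as reconciling the same business records on both sides of an interface:

```python
def diagnose_flow(source_records, target_records, key="order_id"):
    """Compare records exported by one application with what a downstream
    application actually received, and report the discrepancies.

    Both inputs are lists of dicts; the key field name is illustrative.
    """
    source = {r[key]: r for r in source_records}
    target = {r[key]: r for r in target_records}

    missing = sorted(set(source) - set(target))      # sent but never received
    unexpected = sorted(set(target) - set(source))   # received but never sent
    mismatched = [
        k for k in set(source) & set(target)
        if source[k] != target[k]                    # same record, different content
    ]
    return {"missing": missing, "unexpected": unexpected, "mismatched": mismatched}
```

Running a check like this across each interface in the web of applications gives you a factual picture of which legacy flows must be corrected before the migration.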

Fog of War

Another need for a transition phase occurs when you encounter a "fog of war" between applications. This happens when an application has no idea how its data is being translated or used downstream. This uncertainty creates a high risk if you migrate directly to the Cloud. It is worth investing in a transition phase for risk mitigation and discovery of the unknown scope.

The two key symptoms listed above, found in a complex, tightly coupled application environment, are candidates for a transition phase in your migration to the Cloud. The transition phase will help you correct legacy design and gain clarity on your final cloud objective.

Cloud Expectations

Cloud expectations are a requirement that is often missed. Unlike on premise, we often forget that the Cloud is a product with constraints. Before you embark on a major cloud migration, it is recommended that you conduct a review of your existing architecture and expectations.

Understanding your Customisation

The common challenge with cloud migration is the customisation of the existing on-premise product. These customisations are often lost in translation over the years for legacy reasons. Some of them are only uncovered during the cloud migration itself. Thus, it is important to do a deep analysis of your customisations. One guideline is Oracle's CEMLI (Configuration, Extension, Modification, Localization, and Integration) framework.
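
A simple way to start is a customisation inventory tallied by CEMLI category. The sketch below uses hypothetical inventory entries; only the five category names come from the CEMLI framework.

```python
from collections import Counter

# Hypothetical customisation inventory; the categories follow Oracle's CEMLI
# framework: Configuration, Extension, Modification, Localization, Integration.
customisations = [
    {"name": "Approval limits setup", "category": "Configuration"},
    {"name": "Custom invoice print program", "category": "Extension"},
    {"name": "Patched standard posting routine", "category": "Modification"},
    {"name": "Local tax report", "category": "Localization"},
    {"name": "Interface to warehouse system", "category": "Integration"},
]

# Modifications tend to carry the biggest migration risk, since a Cloud product
# cannot be patched the way an on-premise codebase can.
summary = Counter(item["category"] for item in customisations)
for category, count in summary.most_common():
    print(f"{category}: {count}")
```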

Prepare your Cloud Expectations

Cloud expectation requirements must be translated into your critical success factors. These expectations will determine the effectiveness of your cloud migration, and it is common to realise that they are missing. User expectations will also mitigate the risks of a cloud migration. One such cloud expectation is business continuity with minimal disruption to operations.

There is a relationship between customisation and cloud expectations. The Cloud is a product, and the gap will be larger for a heavily customised on-premise application. Thus, you will need to set clear cloud expectations to mitigate these risks. One common approach is removing customisations in preparation for the cloud migration.

OTM Bulk Plan Design for 3PL

The OTM (Oracle Transportation Management) bulk plan is designed entirely around 2PL. It is ideal when your orders can be assigned and planned neatly into your desired equipment. For 3PL (3rd Party Logistics), however, you may want to design it differently. Here are some quick tips to consider in your 3PL bulk plan design.

Masterdata is Not Complete

Bulk plan configuration depends on masterdata availability. For a 3PL, masterdata is never complete because you are a third party. Your bulk plan will definitely run into errors if you design it around a comprehensive set of masterdata. Thus, the rule of thumb for 3PL is to assume incomplete masterdata in your bulk plan design. The key method is to set default values for missing masterdata.
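
The idea of defaulting missing masterdata looks roughly like the sketch below. The field names and default values are hypothetical; in a real OTM setup these would live against the item, location and equipment masterdata rather than in code.

```python
# Hypothetical defaults for a 3PL bulk plan; values are illustrative only.
DEFAULTS = {
    "weight_kg": 10.0,         # assume a nominal weight when the item master has none
    "volume_m3": 0.05,         # assume a nominal volume
    "equipment_type": "20FT",  # fall back to a standard container
}

def apply_defaults(order: dict) -> dict:
    """Fill in missing masterdata values so the bulk plan does not fail."""
    planned = dict(order)
    for field, default in DEFAULTS.items():
        if planned.get(field) in (None, "", 0):
            planned[field] = default
    return planned

order = {"order_id": "ORD-001", "weight_kg": None, "volume_m3": 0.2, "equipment_type": ""}
print(apply_defaults(order))
# {'order_id': 'ORD-001', 'weight_kg': 10.0, 'volume_m3': 0.2, 'equipment_type': '20FT'}
```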

Speed vs Optimisation

As a 3PL, you usually have contractual rates or preferred service providers. Thus, you can speed up the bulk plan by auto-assigning these contracted values. On the other hand, there is not much optimisation required for your shipments. Due to missing masterdata, optimisation will also be inaccurate or even cause bulk plan performance issues.
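
The sketch below shows the spirit of that auto-assignment: look up the contracted carrier for a lane and skip optimisation entirely. The lanes, carriers and rates are all hypothetical.

```python
# Hypothetical contracted rates per lane; a real setup would hold these as rate records.
CONTRACT_RATES = {
    ("SIN", "KUL"): {"carrier": "CARRIER_A", "rate": 350.0},
    ("SIN", "BKK"): {"carrier": "CARRIER_B", "rate": 720.0},
}

def assign_carrier(origin: str, destination: str) -> dict:
    """Assign the contracted carrier for a lane instead of optimising.

    With incomplete masterdata, a direct assignment is faster and more
    predictable than an optimisation pass; unknown lanes go to manual review.
    """
    contract = CONTRACT_RATES.get((origin, destination))
    if contract:
        return {"lane": (origin, destination), **contract, "method": "contracted"}
    return {"lane": (origin, destination), "carrier": None, "method": "needs review"}

print(assign_carrier("SIN", "KUL"))
```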

You cannot configure the OTM bulk plan using the standard approach for 3PL. You must always assign default values for your missing masterdata. It is also clear that you cannot fully optimise your shipment planning as a 3PL. Thus, you should design your bulk plan for speed with constraints rather than shipment optimisation.

COVID-19 is not the Flu

Singapore's approach of treating COVID-19 as an endemic disease like the flu is still plagued with confusion and risks. The general public grappled with this thought as Singapore's cases continued to rise steadily. The latest measure is HBL (Home-Based Learning) for Primary 1 to Primary 5 from 27 Sep to 6 Oct 2021. This shows the risks that exist if we want to consider COVID-19 as flu-like or endemic. Is the public ready to view COVID-19 as endemic?

No more Quarantine

As long as quarantines exist, it is difficult to imagine an endemic environment. As Singapore reaches a high rate of vaccination, we will have to accept a high rate of transmission as the norm. Very soon, there should no longer be a need for quarantine, because quarantine contradicts the endemic approach. Instead, it is a matter of time before vaccination becomes mandatory. This view will be spurred on once vaccination is approved for kids below 12.

A Future by Vaccination

The shift to endemic will lead us to view COVID-19 differently from other endemic illnesses like the flu. For once, vaccination becomes a condition for entering many places or even for holding jobs. If you have COVID-19 and are vaccinated, you can also recover at home under the Home Recovery scheme. It may even be possible that the vaccinated will not require quarantine in future.

COVID-19 is a perfect case study for the future on how to transition from a pandemic to an endemic. In this transition, vaccination becomes a condition and is nearly mandatory. It is interesting to see how things continue to unfold in Singapore as we move to an endemic COVID-19.

Achieving Carbon Neutral

Carbon neutral is not zero carbon, as many think. It means reducing net carbon emissions to zero. Broadly speaking, carbon neutrality can be achieved through carbon offset or carbon reduction. These steps involve several factors to be addressed at the organisational level. I will briefly mention how this can be done by a product owner.

Carbon Offset with Integration

The way to offset the carbon usage of your application is to understand its usage patterns and consumption. This means you will need a carbon footprint summary of the application's inputs and outputs. Carbon offset can usually be achieved via direct integration with other applications. Integration reduces carbon usage by replacing paper transfers or email exchanges between applications.

Carbon Reduction with Auto Scaling

Auto scaling is the most popular method of carbon reduction in an application. Idle time is a waste of carbon if servers run at full capacity. Vice versa, running under capacity creates backlogs and delays, which result in process congestion and exception handling. Auto scaling helps right-size capacity for efficiency and reduced carbon usage.
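
A back-of-the-envelope comparison shows why. The load profile, energy per server-hour and grid carbon factor below are all assumed figures for illustration; only the arithmetic is the point.

```python
import math

# Illustrative assumptions: hourly load expressed as servers needed,
# energy per server-hour, and a grid carbon factor.
hourly_load = [0.2, 0.1, 0.1, 0.3, 0.8, 1.6, 2.4, 3.0, 2.8, 1.5, 0.9, 0.4]
KWH_PER_SERVER_HOUR = 0.25
KG_CO2_PER_KWH = 0.4

# Fixed capacity: always run enough servers for the peak hour.
fixed_server_hours = max(hourly_load) * len(hourly_load)

# Auto scaling: run only what each hour needs (rounded up to whole servers).
scaled_server_hours = sum(math.ceil(load) for load in hourly_load)

for label, hours in [("Fixed", fixed_server_hours), ("Auto scaled", scaled_server_hours)]:
    co2 = hours * KWH_PER_SERVER_HOUR * KG_CO2_PER_KWH
    print(f"{label}: {hours:.0f} server-hours, {co2:.2f} kg CO2")
# Fixed: 36 server-hours, 3.60 kg CO2
# Auto scaled: 20 server-hours, 2.00 kg CO2
```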

Carbon neutrality can be pursued by a product owner through two basic changes. You can offset your carbon by direct integration with your partners. At the same time, you should enable auto scaling to match your server capacity to usage across different consumption periods. This keeps your carbon reduction at an optimal level.