OTM VPD Tips

Oracle VPD (Virtual Private Database) is a quick way to enforce security policies inside the database. It is also one of the features used by the OTM (Oracle Transportation Management) application. Setting up VPD can be quite painful if you are new to Oracle and OTM, and it can be an even greater struggle to maintain the VPD configured by your predecessor.

OTM VPD Error

OTM is notorious for throwing VPD errors when things go wrong. Many log entries will be flagged as policy violations. In some cases, the OTM UI (User Interface) will simply freeze or return a strange error when you try to edit an object. In other cases, you will find that your changes cannot be committed to the database.

OTM VPD Tips

Here are some common VPD tips to speed up your understanding of VPD in OTM.

  • Test out your issue with and without VPD (see the sketch after this list).
  • Always consider VPD as one of the possible root causes of a reported issue.
  • VPD may not work well if your SQL joins too many related tables.
  • Keep your SQL in VPD short and simple.
  • If your VPD is complex, consider ways to simplify it, such as using a column field on the object.
  • Do not implement VPD for the sake of VPD.
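
To illustrate the first tip, here is a minimal sketch that compares row counts with a VPD policy enabled and disabled through Oracle's standard DBMS_RLS package. It assumes the python-oracledb driver and a privileged test account; the connection string, schema, table and policy names are hypothetical placeholders, and you would only run something like this against a test environment.

```python
# Minimal sketch: compare row counts with a VPD policy enabled vs disabled.
# Assumes the python-oracledb driver and a privileged test account;
# the connection string, schema, table and policy names are hypothetical.
import oracledb

DSN = "localhost/otmdb"                                      # hypothetical DSN
POLICY = ("GLOGOWNER", "SHIPMENT", "VPD_SHIPMENT_POLICY")    # schema, table, policy


def set_policy(cursor, enabled: bool) -> None:
    """Enable or disable the VPD policy via the standard DBMS_RLS package."""
    cursor.execute(
        """
        begin
          dbms_rls.enable_policy(
            object_schema => :schema,
            object_name   => :tab,
            policy_name   => :pol,
            enable        => :ena);
        end;
        """,
        schema=POLICY[0], tab=POLICY[1], pol=POLICY[2], ena=enabled,
    )


def row_count(cursor) -> int:
    cursor.execute("select count(*) from glogowner.shipment")
    return cursor.fetchone()[0]


with oracledb.connect(user="test_dba", password="***", dsn=DSN) as conn:
    cur = conn.cursor()
    set_policy(cur, True)
    with_vpd = row_count(cur)
    set_policy(cur, False)
    without_vpd = row_count(cur)
    set_policy(cur, True)          # always re-enable the policy afterwards
    print(f"rows with VPD: {with_vpd}, rows without VPD: {without_vpd}")
```

If the counts differ wildly, or the error disappears once the policy is disabled, you have a strong hint that VPD is the root cause rather than the application logic.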

VPD in OTM is a pretty neat extension of the Oracle database. You will usually need some advanced knowledge of SQL to maintain and decipher the VPD policies. As a rule of thumb, it is good practice to keep VPD simple to spare yourself future headaches!

Love Hate Macro

Anyone who has worked with macros will usually love to hate them! A macro can be a legacy achievement, a headache, and a consolation to Excel users all at once. Yes, that is how frustrating macros can be. Are you someone who is trying to eliminate Excel macros?

Macro Standpoint

A macro is like a nowhere application that is not data friendly. If you have ever watched PiP (Picture-in-Picture) video, you will know how a macro feels inside Excel. I usually relate macros to RPA, where we try to automate the wrong part of the process because users are reluctant to leave Excel. With Microsoft's move to the cloud, macros will face a timely death in the next five years or so.

Macro Ocean to River

Sadly, I am a creator of macros but not a supporter of them. Building a macro is like trying to fit the ocean into a river. Excel gives users the flexibility to create different worksheets, rows and columns. The data flows freely, and the macro has to make sense of this ocean and empty it into a river. This is a total nightmare for your support team.

Moving forward, do reduce your reliance on macros as a low-cost way of data input. Will the move to the cloud present a shift to “cloud macros”? Many upgrades and cloud migrations are already putting a strain on existing macros. Where will you stand?

Running Cost of Data

Data permeates everything now. You may notice that more service providers are starting to charge for data storage and usage. Isn't data free, and created by you? Unfortunately, data does add up to a big running cost. Why should you pay for that running cost?

Data Running Cost

For a layman's understanding, your data running cost will mainly consist of the following (a rough sizing sketch follows the list):

  • Storage
  • Maintaining the storage
  • Resources to maintain the storage
  • Archival and housekeeping
  • Security and Privacy
  • Data transfer and exchange
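
As a rough illustration of how these items add up, here is a back-of-envelope sketch. All rates and volumes below are made-up placeholders for illustration, not real provider pricing.

```python
# Back-of-envelope sketch of a monthly data running cost.
# All figures below are hypothetical placeholders, not real provider pricing.

monthly_rates = {
    "storage_per_gb": 0.025,      # raw storage
    "backup_per_gb": 0.010,       # maintaining the storage (backups, replicas)
    "admin_hours_cost": 800.0,    # resources to maintain the storage
    "archival_per_gb": 0.004,     # archival and housekeeping tier
    "security_tooling": 300.0,    # security and privacy tooling
    "egress_per_gb": 0.09,        # data transfer and exchange
}

def monthly_cost(active_gb: float, archived_gb: float, egress_gb: float) -> float:
    r = monthly_rates
    return (
        active_gb * (r["storage_per_gb"] + r["backup_per_gb"])
        + archived_gb * r["archival_per_gb"]
        + egress_gb * r["egress_per_gb"]
        + r["admin_hours_cost"]
        + r["security_tooling"]
    )

# Example: 5 TB active, 20 TB archived, 500 GB egress per month.
print(f"Estimated monthly running cost: ${monthly_cost(5_000, 20_000, 500):,.2f}")
```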

Data as a Service

Paying all the running costs of your data adds up to a significant amount. In fact, most of the data may not be needed often and could be duplicated elsewhere. Data as a Service (DaaS) is gaining popularity for on-demand data. Now, you can avoid hefty running costs for data used in areas like data analytics; instead, you can opt for DaaS to achieve a similar objective.

The running cost of data is a growing concern for many as we move towards digital transformation and data analytics. You may consider DaaS as an option to reduce your cost and achieve the same desired outcome for your data analytics initiatives.

Location Data

Location is one of the most challenging types of data. This is why many countries created postal codes, so that your mail does not get lost in delivery. In reality, many Asian countries do not use postal codes. The mismatch of rates to locations in a TMS therefore creates a lot of discrepancies. It is also one of the key concerns for the maintenance team, as dirty location data cannot be matched to the correct rates.

Find My Address

There is no standard way to name an address other than splitting it into different fields such as block, street, city or postal code. A small country like Singapore does not actually need a city field. Newer data formats allow the entire address to be saved naturally as a single field. You can also subscribe to map services and query them with different parameters such as postal code or Lat/Lon. However, there is no standard return format, and results vary among map providers (see the sketch below).
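
To show why the lack of a standard return format hurts, here is a small sketch that maps responses from two hypothetical map providers into one common address record. The provider field names are invented for illustration; real providers will differ again.

```python
# Sketch: normalise geocoding results from two hypothetical providers
# into a single common address record. Field names are invented examples.
from dataclasses import dataclass
from typing import Optional


@dataclass
class Address:
    street: Optional[str]
    city: Optional[str]
    postal_code: Optional[str]
    lat: Optional[float]
    lon: Optional[float]


def from_provider_a(payload: dict) -> Address:
    # Provider A nests coordinates and uses "postcode".
    return Address(
        street=payload.get("street"),
        city=payload.get("city"),
        postal_code=payload.get("postcode"),
        lat=payload.get("location", {}).get("lat"),
        lon=payload.get("location", {}).get("lng"),
    )


def from_provider_b(payload: dict) -> Address:
    # Provider B flattens everything and uses "zip", "latitude" and "longitude".
    return Address(
        street=payload.get("addressLine1"),
        city=payload.get("locality"),
        postal_code=payload.get("zip"),
        lat=payload.get("latitude"),
        lon=payload.get("longitude"),
    )


a = from_provider_a({"street": "1 Example Ave", "postcode": "049999",
                     "location": {"lat": 1.28, "lng": 103.85}})
b = from_provider_b({"addressLine1": "1 Example Ave", "zip": "049999",
                     "latitude": 1.28, "longitude": 103.85})
print(a)
print(b)
```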

Handling Location

If you are moving towards digital transformation and data analytics, it is worth investing in some sanity in how you handle location data. Common techniques involve standardising the usage of city or postal code, usually with a list of values (LOV), as in the sketch below. Sometimes you may face a refusal to clean up the data. If so, you end up spending more cost and effort handling the chaotic data.
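
Here is a minimal sketch of the LOV idea: validate incoming city values against a standard list and flag the rest for cleanup. The LOV entries and sample values are invented examples.

```python
# Sketch: validate free-text city values against a standard list of values (LOV).
# The LOV entries and sample data below are invented for illustration.

CITY_LOV = {"SINGAPORE", "KUALA LUMPUR", "JAKARTA", "BANGKOK"}

def validate_cities(raw_values: list[str]) -> tuple[list[str], list[str]]:
    """Split values into those matching the LOV and those needing cleanup."""
    clean, dirty = [], []
    for value in raw_values:
        if value.strip().upper() in CITY_LOV:
            clean.append(value.strip().upper())
        else:
            dirty.append(value)
    return clean, dirty

clean, dirty = validate_cities(["Singapore", "S'pore", "Jakarta Barat", "BANGKOK"])
print("clean:", clean)
print("needs cleanup:", dirty)
```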

Location data is gaining importance in many applications, especially TMS (Transportation Management System). However, handling the different location formats is a cat-and-mouse game. Hopefully, as we move towards smart data, all of this can be automated to relieve the burden on users and application owners.

Extra XML tags

Data today is very different from data of the past. We are living in a period where the two types overlap. Today's data leans towards free formats, usually JSON. Yesterday's data, on the other hand, lives in fixed field types, XSDs and fixed formats, where even an extra XML tag is enough to break the mapping (see the sketch below). Many legacy systems and integration teams exist to support these old data formats. How will the new data challenge these teams to transform?
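
A tiny sketch of the contrast: a rigid mapping that expects an exact set of XML tags complains when an extra tag appears, while a free-format JSON consumer simply ignores what it does not know. The tag and field names are invented examples.

```python
# Sketch: a fixed XML mapping breaks on an extra tag, while a JSON consumer
# simply ignores unknown fields. Tag and field names are invented examples.
import json
import xml.etree.ElementTree as ET

EXPECTED_TAGS = ["OrderId", "Quantity"]          # the fixed, legacy-style mapping

xml_payload = ("<Order><OrderId>1001</OrderId><Quantity>5</Quantity>"
               "<Priority>HIGH</Priority></Order>")
json_payload = '{"orderId": 1001, "quantity": 5, "priority": "HIGH"}'

# Rigid mapping: any unexpected tag is treated as an error.
root = ET.fromstring(xml_payload)
extra = [child.tag for child in root if child.tag not in EXPECTED_TAGS]
if extra:
    print(f"XML mapping broken by extra tags: {extra}")

# Free-format consumer: unknown JSON fields are simply ignored.
data = json.loads(json_payload)
order = {"orderId": data.get("orderId"), "quantity": data.get("quantity")}
print(f"JSON consumer happily extracted: {order}")
```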

Data Transformation

Despite digital transformation, much of the underlying data remains in legacy formats. Data transformation proceeds at a slow pace for various reasons. Cost and time are common ones, and there is also a lack of data transformation experts to convert these legacy systems. Ironically, the key reason is human resistance to changing and transforming legacy data. “If it is not broken, do not change it” remains the prevailing mindset.

Data Transformation Team

Organisations continue to rely on legacy integration teams for their existing systems. The systems of tomorrow will no longer require such teams as more transformation takes place. Instead, these integration teams must take ownership of data transformation and support the move to unstructured data. These evolved teams will also be the subject matter experts for migration to the cloud and back.

In conclusion, an article comes to my mind that resonates with the data transformation team: do you need a CDO (Chief Data Officer)? The trend will continue to push towards the data of today! As digital transformation pushes forward rapidly, we must not neglect the historical data that is left behind.

Handling Dead Codes and Processes

Ever encountered dead codes and processes? I have been cleaning up many of them. Usually, they appear in systems that have been around for more than five years. Dead codes and processes are usually system features that users no longer require or understand. Here are some quick approaches to handling their removal.

Happy Path Approach

When you upgrade an old system to a new one, you should first plot the happy path. In daily operations, roughly 80% of system usage is on the happy path, 10% on exceptions and the rest on manual handling. Using the happy path approach, your migration carries over all the key essential functionality. Along the way, you can also clean up the dead processes and their corresponding code.

I Don’t Know

Users compartmentalise system usage in their thought processes. As time goes by, fanciful nice-to-have features are forgotten and languish in neglect. Querying users about these features will get a response of “I don’t know”. As a rule of thumb, such features can be considered “dead” and removed with no impact on the existing system (a usage-based sketch follows).
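
As a rough illustration, here is a sketch that flags features as “dead” when a hypothetical usage log shows no activity past a cut-off date. The feature names, counts and threshold are made-up examples; in practice the figures would come from your audit or access logs.

```python
# Sketch: flag features as "dead" when usage logs show no recent activity.
# Feature names, counts and the cut-off date are hypothetical examples.
from datetime import date

CUTOFF = date(2022, 1, 1)          # anything unused since this date is suspect

# (feature, last_used, uses_in_last_year) - imagine this came from audit logs
usage_log = [
    ("create_shipment", date(2023, 5, 2), 12450),
    ("bulk_rate_upload", date(2023, 4, 28), 310),
    ("fax_notification", date(2018, 11, 3), 0),
    ("legacy_report_x", date(2019, 7, 15), 0),
]

dead = [f for f, last_used, uses in usage_log if last_used < CUTOFF and uses == 0]
print("Candidates for removal:", dead)
```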

Dead codes and processes often result in degraded system performance and unnecessary support ticket costs. Thus, it is a good exercise to clean them up when you are upgrading your legacy system.

Is Data Analytics a White Elephant?

Data analytics does have lots of promising uses. Soon, however, you realise that you also need good processes and an action plan. A data analytics team that is disconnected from your business processes, operations team or application team is literally a white elephant.

Data Analytics in Silo

You will often see organisations set up a data analytics team under the direction of sales and marketing. The key focus of this team is often the presentation of fanciful dashboards for top management to show to the CEO or the board. The data used is usually financial, with no engagement from the ground.

Data Analytics for Compliance

As many will have encountered, the compliance team highlights and suggests solutions for data discrepancies. Data analytics then seeks to cleanse and translate the flagged data into a more compliant form. The end results are usually alerts and notifications for data discrepancies. With no root cause analysis or SOP (Standard Operating Procedures) in place, alerts remain just alerts, with no further action taken.

Building a data analytics team is not about creating pretty dashboards. You need SMEs (Subject Matter Experts) to drive a change management programme and effect an action plan. Do conduct sound RCA (Root Cause Analysis) instead of building enhancements just to cleanse the data.

Language Localisation

Localisation is a key component of a TMS (Transportation Management System). Legacy systems struggle with language localisation, partly due to database designs that force the implementation to be encoded in certain locales. The globalisation of applications has created new solution architectures where localisation is the default. One such setting is the configuration of languages and display text.

Language Perspective

One may think languages in a system are pretty straightforward. However, to a system, English does not literally mean English. Due to the complexity of languages, we seek some sanity in classifying languages within a system. One such classification is ISO 639. However, it is not adopted uniformly, because different platforms take different approaches; for instance, Windows language encoding differs from that in Linux. The application SME (Subject Matter Expert) must be familiar with language localisation encodings across the various platforms (see the sketch below).
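
To make the cross-platform point concrete, here is a sketch that normalises a few locale identifiers, Windows-style and POSIX-style spellings alike, down to a plain language code in the spirit of ISO 639-1. The handling is deliberately simplified for illustration and is not a complete standards-compliant mapping.

```python
# Sketch: normalise platform-specific locale identifiers to a bare language code.
# Deliberately simplified for illustration; not a full ISO 639 implementation.

def to_language_code(locale_id: str) -> str:
    """Reduce 'en-US', 'en_US.UTF-8' or 'zh_TW.Big5' style identifiers to a language code."""
    # Strip any encoding suffix (e.g. '.UTF-8') and unify the separator.
    language = locale_id.split(".")[0].replace("-", "_").split("_")[0]
    return language.lower()

samples = ["en-US", "en_US.UTF-8", "zh-CN", "zh_TW.Big5", "ms_MY"]
for s in samples:
    print(f"{s:15} -> {to_language_code(s)}")
```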

Future Language Localisation

At present and in the near future, we will see NLP (Natural Language Processing) gaining popularity, chatbots being one example. There will be a focus on systems localising naturally from user behaviour and geolocation. Translation will be real time and ML (machine learning) driven. Data locales will no longer be required, as data can be stored as it is and translated on the fly.

Is your Masterdata clean?

A recent project reminded me of the real-world scenario where masterdata is never clean. We were told that the textbox is free text, and that the city can be a street, a district or even a province. Of course, this creates a challenging downstream impact for your data analytics.

Masterdata POV (Point of View)

  • Decide the POV of your masterdata. One system's masterdata could be another system's transactional data, and vice versa.
  • Use a reference point and communicate a common language for masterdata.
  • Knowing your end game will decide what masterdata to collect and from which POV.

Tips to clean your masterdata

  • If more than 50% of your masterdata needs cleansing, it may be worthwhile to drop this masterdata altogether.
  • Know what to clean; do not clean for the sake of cleaning.
  • Clean masterdata exhibits consistent patterns, while unclean masterdata is total chaos (a profiling sketch follows this list).
  • Know your domain well to clean effectively!
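
Here is a small sketch of the first and third tips: profile a masterdata column against an expected pattern and report what share would need cleansing. The postal-code pattern and sample values are hypothetical examples.

```python
# Sketch: estimate what share of a masterdata column needs cleansing.
# The postal-code pattern and sample values are hypothetical examples.
import re

POSTAL_PATTERN = re.compile(r"^\d{6}$")   # e.g. Singapore-style 6-digit postal codes

sample_postal_codes = ["049999", "S'pore", "18989", "unknown", "018-989", "339696"]

dirty = [v for v in sample_postal_codes if not POSTAL_PATTERN.match(v)]
dirty_ratio = len(dirty) / len(sample_postal_codes)

print(f"{dirty_ratio:.0%} of values need cleansing: {dirty}")
if dirty_ratio > 0.5:
    print("More than 50% dirty - consider dropping this masterdata instead of cleansing.")
```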

Cleaning masterdata is an iterative process. You get better and more resilient with practice. A good sense for data is also advantageous. Good luck cleaning, and may the force be with you!

Data Gap

Have you ever heard of the Data Gap? It is a common issue with legacy applications and processes. Data Gaps are missing data in a required dataset. They are hidden risks to the organisation and incur unnecessary cost. In my latest project, data gaps were the single biggest contributor to support tickets. So, what can we do to remove Data Gaps?

Signs of Data Gap

  • Increased support tickets where the resolution is a data update.
  • Additional fields in the manual process versus the existing fields from integration.
  • Mismatch of data fields between the source and target systems.
  • Added requirements for data translation and exception logic in integration and reporting.

Ways to handle Data Gap

  • Harmonisation of data between the source and target systems (see the sketch below).
  • Business process re-engineering of existing SOPs (Standard Operating Procedures).
  • Removal of dead fields and updated requirements to automate retrieval of missing data.
  • Change management so that users identify and manage Data Gaps.
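
To close, here is a sketch of one way to spot a Data Gap: compare the fields the target system requires against what the source integration actually supplies. The field names and sample record are invented examples.

```python
# Sketch: detect data gaps by comparing required target fields against
# the fields supplied by the source integration. Field names are invented.

required_by_target = {"shipment_id", "origin", "destination", "postal_code", "incoterm"}

source_record = {
    "shipment_id": "SH-1001",
    "origin": "SGSIN",
    "destination": "MYPKG",
    # postal_code and incoterm never arrive from the legacy interface
}

missing = required_by_target - source_record.keys()
empty = {k for k, v in source_record.items() if v in (None, "")}

print("Data gap - fields never supplied:", sorted(missing))
print("Data gap - fields supplied but empty:", sorted(empty))
```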