You Are Not Learning with Dashboards

Three years ago, I embarked on a mission to design and deploy dashboards. The overarching vision was to use dashboards as a tool to enable business analytics and to learn how to improve business operations. Reflecting on it today, it is interesting to realise a key lesson from that mission. The lesson stems from the misconception that a dashboard will, by itself, allow users to learn and improve their business.

Users not Ready for Dashboard

One major piece of feedback from users is that they are unable to use the dashboard because its data is not refreshed in a timely manner. Hearing this remark makes one ponder why organisations should not view a tool like a dashboard as a magic pill to resolve business issues. It was also a reality check for me on the lengths users will go to find an excuse not to use a tool. After all, it is easy to blame a tool like a dashboard because it is passive and has no machine learning capability to retort. Users are the missing learning component needed to analyse the data and take corrective actions.

Ugly Truths with Dashboard

If management queries the effectiveness of the dashboard, it shows how ignorant of and disconnected from operations management is. The right questions to ask users are: what have you learnt from the dashboard, and what else would you like to learn from it? Dashboards and data analytics reveal ugly truths about operations. This is why many resist adopting dashboards for the review of SOPs (Standard Operating Procedures). The typical excuses you will hear blame the dashboard for data timeliness, incorrect data or data errors. It is not surprising, because the truth leads to change, and change is something that operations and quality teams hate to indulge in!

The key lesson from adopting dashboards is to acknowledge that having one does not automatically mean users learn from it. Users will choose to ignore the ugly truths a dashboard surfaces in order to avoid change. You must ensure at least a monthly review of the dashboard to check what users have learnt and to improve the SOP. A dashboard review is not a forum to complain about data timeliness and data errors.

Art of Monitoring Data

Monitoring data is the simplest form of data analytics. Yet it is harder to implement than it looks, and the outcome is usually less than satisfactory. Users have the misconception that monitoring data is the same as extracting data, so there is little follow-through on what to do with it. Without a clear sense of objectives and outcomes, data monitoring is just a dump-and-show exercise and a waste of developers' effort.

Monitoring Objectives

It is common to see requests to monitor data for the sake of monitoring. These requests are usually raised by non-users who are eager to see how the system performs. Their views often have no historical bearing and are based on hearsay or past experience. You must evaluate and review the objectives of why and what you are monitoring. Be a devil's advocate and query the objectives instead of agreeing blindly.

Monitoring Outcomes

There are only two outcomes against a monitoring objective: either we meet it or we do not. Follow-through action from data monitoring is key, yet this part is often neglected and left to its own devices. Users and non-users must communicate when the objectives are not met. In a nutshell, each monitoring outcome and its respective action plan must be clear and actionable.
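
A minimal sketch of what "objective, outcome, action" can look like in practice is shown below. The metric name, target and escalation action are hypothetical placeholders, not a prescription.

    # Minimal sketch of "objective -> outcome -> action" for one monitored metric.
    # The metric name, threshold and escalation action are hypothetical placeholders.

    def check_objective(metric_name: str, actual: float, target: float) -> dict:
        """Compare a monitored value against its objective and return an outcome."""
        met = actual >= target
        return {
            "metric": metric_name,
            "actual": actual,
            "target": target,
            "outcome": "met" if met else "not met",
            # The action plan is agreed up front, not invented after the numbers arrive.
            "action": "none" if met else "escalate to process owner and review SOP",
        }

    result = check_objective("on_time_delivery_rate", actual=0.91, target=0.95)
    print(result["outcome"], "->", result["action"])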

Learning the art of monitoring data is the first step in learning data analytics. Setting clear objectives is the key to effective monitoring. If you do not have clear action plans, consider not wasting developer effort extracting data for the sake of monitoring.

OTM VPD Tips

Oracle VPD (Virtual Private Database) is a quick way to design row-level security policies in the database. It is also one of the features used in the OTM (Oracle Transportation Management) application. Setting up VPD can be quite painful if you are new to Oracle and OTM, and maintaining the VPD configured by your predecessor can be an even greater struggle.

OTM VPD Error

OTM is notorious for throwing VPD errors when things go wrong. Many log errors are flagged as policy violations. In some cases, the OTM UI (User Interface) screen will simply freeze or give a strange error when you try to edit an object. In other cases, you will find your changes cannot be committed to the database.

OTM VPD Tips

Here are some common VPD tips to speed up your understanding of VPD in OTM.

  • Test your issue with and without VPD (see the sketch after this list).
  • Always consider VPD as one of the possible root causes of a reported issue.
  • VPD may not work well if your SQL joins too many related tables.
  • Keep your SQL in VPD short and simple.
  • If your VPD is complex, consider ways to simplify it by using a column on the object.
  • Do not implement VPD for the sake of VPD.
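
For the first tip, a minimal sketch of comparing what a VPD-restricted user sees against an unrestricted reference is shown below. The table name (SHIPMENT), the accounts (OPS_USER, GLOGOWNER) and the connection details are placeholders; adapt them to your own OTM environment.

    # Minimal sketch: compare row counts seen by a VPD-restricted user and an
    # exempt user. Table, users and DSN are hypothetical placeholders.
    import oracledb  # python-oracledb driver

    def row_count(user: str, password: str, dsn: str, sql: str) -> int:
        with oracledb.connect(user=user, password=password, dsn=dsn) as conn:
            cur = conn.cursor()
            cur.execute(sql)
            return cur.fetchone()[0]

    query = "SELECT COUNT(*) FROM shipment"

    # A user covered by the VPD policy only sees rows matching the policy predicate.
    restricted = row_count("OPS_USER", "***", "otm_db", query)

    # A user exempt from the policy (e.g. the schema owner) sees the full table.
    unrestricted = row_count("GLOGOWNER", "***", "otm_db", query)

    if restricted != unrestricted:
        print(f"VPD is hiding {unrestricted - restricted} rows - review the policy predicate")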

VPD in OTM is a pretty neat extension of the Oracle database. You will usually need some advanced knowledge of SQL to maintain and decipher the VPD. As a rule of thumb, it is good practice to keep your VPD simple to spare yourself future headaches!

Love Hate Macro

Anyone who has written macros will usually love to hate them! A macro can be at once a legacy achievement, a headache, and a consolation to Excel users. Yes, that is how frustrating macros can be. Are you someone who is trying to eliminate Excel macros?

Macro Standpoint

A macro is like a nowhere application that is not data friendly. If you have ever tried watching PiP (Picture-in-Picture), you will know how a macro feels inside Excel. I usually liken macros to RPA: we end up automating the wrong part of the process because users are reluctant to leave Excel. With Microsoft's move to the cloud, macros will face a timely death in the next five years or so.

Macro Ocean to River

Sadly, I am a creator of macros but not a supporter of them. Building a macro is like trying to fit the ocean into a river. Excel gives users the flexibility to create whatever worksheets, rows and columns they like. The data flows freely, and the macro has to make sense of this ocean and empty it into a river. This is a total nightmare for your support team.
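
As a rough illustration of the "ocean into a river" problem, here is a minimal sketch of forcing a free-form Excel sheet into a fixed structure outside of a macro. The file name, sheet name and expected columns are hypothetical placeholders.

    # Minimal sketch: validate a free-form Excel upload against an agreed structure.
    # File name, sheet name and expected columns are hypothetical placeholders.
    import pandas as pd

    EXPECTED = ["order_id", "ship_date", "origin", "destination", "weight_kg"]

    df = pd.read_excel("user_upload.xlsx", sheet_name="Orders")

    # Anything users renamed, reordered or added on their own breaks the mapping.
    missing = [c for c in EXPECTED if c not in df.columns]
    extra = [c for c in df.columns if c not in EXPECTED]

    if missing:
        raise ValueError(f"Cannot load sheet, missing columns: {missing}")
    if extra:
        print(f"Ignoring unexpected columns: {extra}")

    clean = df[EXPECTED]  # the "river": only the agreed structure flows downstream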

Moving forward, do reduce the reliance on macros as a low-cost way of data input. Will the move to the cloud present a shift to a “cloud macro”? Many upgrades and cloud migrations are already putting a strain on existing macros. Where will you stand?

Running Cost of Data

Data permeates everything now. You may notice that more service providers are starting to charge for data storage and usage. Is data not free, and created by you? Unfortunately, data does add up to a big running cost. So why should you pay for it?

Data Running Cost

For a layman's understanding, your data running cost mainly consists of the following:

  • Storage
  • Maintaining the storage
  • Resources to maintain the storage
  • Archival and housekeeping
  • Security and Privacy
  • Data transfer and exchange
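
To put rough numbers on these components, here is a back-of-envelope sketch. All unit prices, volumes and hours below are hypothetical placeholders, not quotes from any provider.

    # Minimal back-of-envelope sketch of a monthly data running cost.
    # All unit prices and volumes are hypothetical placeholders.

    storage_gb = 2_000            # raw data kept online
    storage_price = 0.025         # $ per GB-month
    egress_gb = 300               # data transferred out / exchanged
    egress_price = 0.09           # $ per GB
    admin_hours = 10              # housekeeping, archival, access reviews
    admin_rate = 60               # $ per hour of support effort

    monthly_cost = (
        storage_gb * storage_price
        + egress_gb * egress_price
        + admin_hours * admin_rate
    )
    print(f"Estimated monthly running cost: ${monthly_cost:,.2f}")
    # ~ $50 storage + $27 transfer + $600 people: the people cost dominates.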

Data as a Service

Paying all the running costs of your data adds up to a significant amount. In fact, most of the data may not be needed often and could be duplicated elsewhere. Data as a Service (DaaS) is gaining popularity for on-demand data. You can avoid hefty running costs for workloads like data analytics and opt for DaaS to achieve a similar objective instead.

The running cost of data is a growing concern for many as we move towards digital transformation and data analytics. Consider DaaS as an option to reduce your cost while achieving the same desired outcome for your data analytics initiatives.

Location Data

Location is one of the most challenging kinds of data. This is why many countries created postal codes, so that your mail does not get lost in delivery. In reality, many Asian countries do not use postal codes. Thus, mismatching rates to locations in a TMS often creates a lot of discrepancies. It is also a key concern for the maintenance team, as dirty location data cannot be matched to the correct rates.

Find My Address

There is no standard way to name an address except to split it into fields like block, street, city or postal code. A small country like Singapore does not actually need a city field. Newer data formats allow the entire address to be saved naturally. You can also subscribe to map services to query by different parameters such as postal code or lat/lon. However, there is no standard return format, and results vary among map providers.

Handling Location

If you are moving towards digital transformation and data analytics, it is worth investing in some sanity around your location data. Common techniques involve standardised use of city or postal code, usually enforced through a list of values (LOV). Sometimes you will face a refusal to clean up the data; if so, you end up spending more cost and effort handling the chaotic data.
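
A minimal sketch of LOV-based standardisation is shown below. The LOV entries, alias table and raw city strings are hypothetical examples, not a complete reference list.

    # Minimal sketch of LOV-based location standardisation.
    # LOV entries and aliases are hypothetical examples.
    CITY_LOV = {
        "SINGAPORE": "SG-SIN",
        "KUALA LUMPUR": "MY-KUL",
        "JAKARTA": "ID-JKT",
    }

    # Common free-text variants mapped back onto the LOV.
    ALIASES = {
        "S'PORE": "SINGAPORE",
        "SINGAPORE CITY": "SINGAPORE",
        "KL": "KUALA LUMPUR",
    }

    def standardise_city(raw: str) -> str | None:
        """Return the standard location code, or None so the record can be flagged."""
        city = raw.strip().upper()
        city = ALIASES.get(city, city)
        return CITY_LOV.get(city)

    print(standardise_city(" s'pore "))   # SG-SIN
    print(standardise_city("Jakarta"))    # ID-JKT
    print(standardise_city("Jurong"))     # None -> route to data clean-up queue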

Location data is gaining importance for many applications, especially a TMS (Transportation Management System). However, handling the different location formats is a cat-and-mouse game. Hopefully, as we move towards smart data, all this can be automated to relieve the burden on users and application owners.

Extra XML tags

The data of today and the data of the past are very different, and we are living in a period where the two overlap. Today's data leans towards free formats, usually JSON. Yesterday's data lives in field types, XSDs and fixed formats, where even an extra XML tag is enough to break the mapping. Many legacy systems and integration teams exist to support these old formats. How will the new data challenge these teams to transform?
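
The sketch below illustrates why an extra XML tag breaks a fixed mapping while a free-format JSON consumer typically shrugs it off. The message shapes and field names are hypothetical examples, not an OTM or XSD specification.

    # Minimal sketch: strict fixed-format XML mapping vs tolerant JSON parsing.
    # Message shapes and field names are hypothetical examples.
    import json
    import xml.etree.ElementTree as ET

    EXPECTED_TAGS = ["OrderId", "ShipDate", "Weight"]  # the agreed fixed format

    xml_msg = "<Order><OrderId>1</OrderId><ShipDate>2021-06-01</ShipDate>" \
              "<Weight>12</Weight><Priority>HIGH</Priority></Order>"  # extra tag!

    root = ET.fromstring(xml_msg)
    unexpected = [child.tag for child in root if child.tag not in EXPECTED_TAGS]
    if unexpected:
        # A strict, XSD-style integration would reject the whole message here.
        print(f"Fixed mapping broken by unexpected tags: {unexpected}")

    json_msg = '{"orderId": 1, "shipDate": "2021-06-01", "weight": 12, "priority": "HIGH"}'
    order = json.loads(json_msg)
    # A free-format consumer reads the keys it cares about and ignores the rest.
    print(order["orderId"], order["shipDate"], order["weight"])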

Data Transformation

Unlike digital transformation, much of the underlying data remains in legacy formats. Data transformation proceeds at a slow pace for various reasons; cost and time are common ones, and there is also a lack of data transformation experts to convert these legacy systems. Ironically, the key reason is human resistance to changing and transforming legacy data. “If it is not broken, do not change it” remains the prevailing mindset.

Data Transformation Team

Organisations continue to rely on legacy integration teams for their existing systems. The systems of tomorrow will no longer require such teams as more transformation takes place. Instead, these integration teams must take ownership of data transformation and support the move to unstructured data. These evolved teams will also be the subject matter experts for migration towards the cloud and vice versa.

In conclusion, a question comes to mind that resonates with the data transformation team: do you need a CDO (Chief Data Officer)? The trend will continue to push towards the data of today! As digital transformation pushes forward rapidly, we must not neglect the historical data left behind.

Handling Dead Code and Processes

Ever encountered dead code and processes? I have cleaned up many of them. Usually this happens to systems that have been around for more than five years. Dead code and processes are usually system features that users no longer require or understand. Here are some quick approaches to handling their removal.

Happy Path Approach

When you upgrade an old system to a new one, first plot the happy path. In daily operations, roughly 80% of system usage follows the happy path, 10% covers exceptions and the rest is manual handling. Using the happy path approach, you migrate all the essential functionality, and in the process you can clean up dead processes and their corresponding code.

I Don’t Know

Users compartmentalise system usage in their thought processes. As time goes by, fanciful nice-to-have features are forgotten and languish in neglect. Querying users about these features will get a response of “I don't know”. As a rule of thumb, such features can be considered “dead” and removed with no impact to the existing system.
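
One way to back up the "I don't know" conversation is to check usage logs before the upgrade. The sketch below is a minimal example; the feature names, counts and threshold are hypothetical placeholders.

    # Minimal sketch: flag candidate "dead" features from aggregated usage logs.
    # Feature names, counts and the threshold are hypothetical placeholders.
    from collections import Counter

    # e.g. aggregated from 12 months of audit / access logs
    feature_usage = Counter({
        "create_shipment": 48_210,
        "rate_inquiry": 9_534,
        "bulk_fax_export": 0,        # nobody remembers what this does
        "legacy_label_print": 3,
    })

    DEAD_THRESHOLD = 10  # fewer uses than this in a year -> candidate for removal

    candidates = [f for f, count in feature_usage.items() if count < DEAD_THRESHOLD]
    print("Confirm with users, then retire:", candidates)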

Dead code and processes often degrade system performance and add unnecessary cost through support tickets. Thus, it is a good exercise to clean them up when you are upgrading your legacy system.

Is Data Analytics a White Elephant?

Data analytics does have plenty of promising uses. Soon enough, though, you realise you need good processes and an action plan. A data analytics team that is disconnected from your business processes, operations team or application team is literally a white elephant.

Data Analytics in Silo

You will often see organisations set up a data analytics team under the direction of sales and marketing. The key focus of such a team is often fanciful dashboards for top management to present to the CEO or the board. The data used is usually financial, with no engagement from the ground.

Data Analytics for Compliance

As many will have encountered, the compliance team will highlight data discrepancies and suggest solutions. Data analytics then seeks to cleanse and translate the flagged data into something more compliant. The end results are usually alerts and notifications for data discrepancies. With no root cause analysis or SOP (Standard Operating Procedures) in place, alerts remain alerts, with no further action taken.

Building a data analytics team is not about creating pretty dashboards. You need SMEs (Subject Matter Experts) to drive a change management programme and carry out the action plan. Conduct sound RCA (Root Cause Analysis) instead of building enhancements just to cleanse the data.

Language Localisation

Localisation is a key component of a TMS (Transportation Management System). Legacy systems struggle with language localisation partly due to database designs that force the implementation to be encoded in certain locales. The globalisation of applications has created new solution architectures where localisation is the default. One such setting is the configuration of languages and display text.

Language Perspective

One may think languages in a system are pretty straightforward. However, to a system, English does not literally mean English. Due to the complexity of languages, we seek some sanity in classifying them; one such classification is ISO 639. However, not every system adopts the same standard, because different vendors made different choices: the language identifiers on Windows, for instance, differ from those on Linux. The application SME (Subject Matter Expert) must be familiar with language and locale encoding across the various platforms.
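
A minimal sketch of reducing platform-specific locale identifiers to a common ISO 639 language code is shown below. The sample identifiers are illustrative only and do not cover every platform's convention.

    # Minimal sketch: normalise platform-specific locale identifiers onto ISO 639
    # language codes. Sample values are illustrative placeholders.

    def language_code(locale_id: str) -> str:
        """Reduce 'en-US', 'en_US.UTF-8', 'EN' etc. to the ISO 639-1 language part."""
        cleaned = locale_id.strip().replace("-", "_")
        language = cleaned.split("_")[0].split(".")[0]
        return language.lower()

    samples = ["en-US", "en_US.UTF-8", "zh-CN", "zh_CN.GB18030", "FR"]
    print({s: language_code(s) for s in samples})
    # {'en-US': 'en', 'en_US.UTF-8': 'en', 'zh-CN': 'zh', 'zh_CN.GB18030': 'zh', 'FR': 'fr'}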

Future Language Localisation

At present and in the near future, we will see NLP (Natural Language Processing) gaining popularity, for example in chatbots. The focus will be on systems localising naturally from user behaviour and geolocation. Translation will be real-time and ML (machine learning) driven. Data locales will no longer be required, as data can be stored as it is and translated on demand.