Apex Dashboard

After weeks of tearing my hair out over Oracle APEX and REST, I finally understand the basic approaches to using REST in APEX. Now I can finally have fun with the APEX Dashboard, and transactional data can be presented nicely with it. This is not to be mistaken for an analytical dashboard tool like Tableau. Here is my review of the APEX Dashboard, which I managed to set up within a day or two.

Why Apex Dashboard?

Data always comes in two flavours: a list or a diagram. Thus, it is nice to see a dashboard as a standard template in APEX. We do not really need a complex dashboard tool like Tableau, because we are dealing with operational data; a chart presented in dashboard format is something the user can easily relate to. A flexible part of the APEX dashboard is the ability to use SQL. Voila! That is why I can churn out a decent dashboard within a day!
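For example, each chart region in the dashboard template is driven by a plain SQL query. A minimal sketch, assuming a hypothetical ORDERS table with a STATUS column:

select status   as label,
       count(*) as value
from   orders
group  by status
order  by 2 desc;

Point a bar or pie chart region at a query like this and the dashboard practically builds itself.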

Agile your Dashboard

The quick and lightweight dashboard suits operational users and the agile method. I can easily pilot the dashboard and update it to user requirements with ease. The rapid iteration allows users to visualise the data efficiently. Sometimes I even redo my dashboard from scratch because the older version is not to my liking. You can even mix and match dashboards across regions in an APEX page.

It is good to have a tool with simple, lightweight features like the dashboard. I do not need a separate, dedicated dashboard tool for operational users. Finally, the dashboard template adds a nice visual appeal for users. After all, we all like a good UI and well-presented visual data.


ORA-20001: You have exceeded the maximum number of web service requests per workspace.

Today, I got a rude shock when our Oracle APEX workspace displayed the error ORA-20001. Thankfully, this seems to be a common occurrence when setting up REST.

ORA-20001: You have exceeded the maximum number of web service requests per workspace.

Solution

There are actually two solutions to tackle this issue, and interestingly, they are applied differently. The first is to log in to the APEX instance as an administrator. You can select the administration services link below the standard workspace login.

Go to Workspace > Security settings > Workspace Isolation > Maximum Web Service Requests

Increase the number for maximum web service requests, and that will solve the issue. However, you may not find this option in the UI. If that is the case, you will need to amend the parameter with the APEX_INSTANCE_ADMIN SQL functions. Start the database console for your APEX autonomous database.

To view the parameter, you can run the SQL command below.

select apex_instance_admin.get_parameter('MAX_WEBSERVICE_REQUESTS') as max_service_requests
from dual;

To amend the parameter, you can run the SQL below.

begin
  apex_instance_admin.set_parameter('MAX_WEBSERVICE_REQUESTS', '500000'); -- increase to 500000
  commit;
end;

To our relief, this worked, though I wonder why it is necessary. REST requests are common, and perhaps the limit is good from a security standpoint. However, the error is too technical in nature for a standard user to solve. Another error cleared, and we are good to go.

Oops! OTM Glog Not Backward Compatible

The major issue with upgrading and migrating to the Cloud is backward compatibility. The process is like evolution, where an entire species goes extinct unless it evolves. Take the OTM upgrade as an example: the nonsensical, troublemaking Glog XML is forever not backward compatible. This creates a ripple effect on legacy systems, which have to conform to the change. While we lament this issue, what should be done?

Where to Change?

When the product does not support backward compatibility, it raises the question of where you are willing to invest in change. Usually, there are two areas where the change could be made. The standard approach is to make the change on your end to conform to the upgraded Glog. But what happens when the change impact is huge and takes a long time? The other approach is to seek the vendor's help by raising a product change request to support backward compatibility.

Transitional Application

You will be in a deadlock when both approaches have difficulty meeting the required change. Thankfully, with the Cloud, you can now easily create a transitional application that connects the two sides and translates the changes for backward compatibility. This is a common interim solution to keep the change impact to a minimum. The application is transitional in nature and allows migration or upgrade activities to proceed. It also lets you carry out the full change to the new Glog at a later stage.
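As a rough illustration, a translation layer can be as simple as a PL/SQL function that maps legacy XML element names to the upgraded ones at the boundary. A minimal sketch, with purely hypothetical tag names:

create or replace function translate_glog (p_xml in clob) return clob is
  l_xml clob := p_xml;
begin
  -- rename a legacy element to its upgraded Glog equivalent (names are illustrative)
  l_xml := replace(l_xml, '<ShipmentGid>', '<ShipmentXid>');
  l_xml := replace(l_xml, '</ShipmentGid>', '</ShipmentXid>');
  return l_xml;
end;
/

In practice, an XSLT or a mapping table is safer than string replacement, but the principle is the same: translate at the boundary so neither system has to change.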

In the bid to upgrade to the cloud, many applications are not backward compatible with older versions. This creates a dilemma on where to manage the change. You may want to consider building a transitional application to minimise the change impact of backward incompatibility, as with OTM Glog.

Data Synch for Cloud

Data synchronisation will be standard for the Cloud because data resides on different platforms. The ideal synchronisation timing is always real time. Unfortunately, this can be costly and unrealistic, because data changes only happen at particular moments, so a full synchronisation is not efficient. How should we design for real-time synchronisation?

User Behavioural Pattern

All data goes through a lifecycle from creation to modification and/or view only. One way to synch your data is to understand user behavioural patterns. This helps you add triggers where there are user actions. The trigger to synch is usually there to maintain modification consistency across the source and target systems. It is also wise to close off data and mark it as read only; that way, costly modification synchs are no longer needed for that data.
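A minimal sketch of such an action-driven trigger, assuming a hypothetical ORDERS table and a SYNC_QUEUE staging table that a separate job drains to the target system:

create table sync_queue (
  table_name varchar2(30),
  record_id  number,
  changed_on timestamp
);

create or replace trigger trg_orders_sync
after insert or update on orders
for each row
begin
  -- queue only the changed row instead of synchronising the whole table
  insert into sync_queue (table_name, record_id, changed_on)
  values ('ORDERS', :new.order_id, systimestamp);
end;
/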

Scheduled Synchronisation

The most common synchronisation is a scheduled job. This brute-force approach helps ensure data integrity. However, it can become costly if the data grows exponentially. It is usually more cost effective to synch only the changes instead of running a full data synchronisation. A targeted, scheduled synchronisation is the most effective.
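A sketch of such a targeted schedule with DBMS_SCHEDULER, assuming a hypothetical SYNC_ORDERS_DELTA procedure that pushes only the rows changed since its last run:

begin
  dbms_scheduler.create_job(
    job_name        => 'SYNC_ORDERS_DELTA_JOB',
    job_type        => 'PLSQL_BLOCK',
    job_action      => 'begin sync_orders_delta; end;',
    repeat_interval => 'FREQ=MINUTELY;INTERVAL=15', -- every 15 minutes
    enabled         => true);
end;
/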

User behavioural patterns help determine the data synchronisation you require in the Cloud. A full synchronisation is costly, so you should add filters to synchronise efficiently.

RFP 101

A Request for Proposal (RFP) is a good information-gathering process. It is also one of the most annoying parts of acquiring new business. For the incumbent, an RFP sounds like a formality because they have the home advantage. Even so, the RFP plays an important role in soliciting information from, and building an understanding of, the vendors.

Setting Expectations

An RFP is a good way of setting expectations. Sometimes these expectations translate into answers that you may not have thought of. There are also instances where an RFP seeks answers on every available aspect, which will make you ponder its scope. Thus, you must always use this stage to align expectations. A team experienced with RFPs will know how to clarify and balance expectations, leading to a higher chance of success.

Template your RFP Responses

It is surprising to see how many teams scramble to answer an RFP at the last minute. The RFP is a fairly standard process with minor variations across the board. If your team is constantly dealing with RFPs, it is worth investing in templating your RFP responses. There should be a cookbook for dealing with RFPs; you should not approach each one as though it were new. There should also be a set of common clarifications for reviewing the RFP.

Creating and responding to RFPs will get easier as you build up your templates and knowledge base. Unfortunately, not all teams are adept at handling RFPs, and it pays to seek advice if you do not handle them often. With practice, RFPs will be a breeze to win!

Why I Need a Cloud Platform

It is my third year of having a cloud platform. Year 1 happened due to a major project. Year 2 was exploratory, and I started to use simple features like object storage. It was only in the third year that I managed to explore and build an entire cloud architecture. There is no turning back, because the cloud platform has given me the freedom to be agile and to configure applications quickly. This is a reflection on why I need a cloud platform.

Self Service

The reality of self service has finally reached the IT community, and that self service comes in the form of a cloud platform. As product owners, we used to need a box before we could initiate any new product feature, and such a request took a long time, from procurement to deployment of the application module. All of that effort can now be self service on a cloud platform. You can spin up a compute instance within minutes. You can even set up an entire eCommerce architecture on a cloud platform.

Finally Agile

Another favourite advantage of having a cloud platform is being fully agile. I am no longer constrained by dependencies like network, compute or storage; these are now easily available on the cloud platform. In other words, you can pursue agile innovation with high uncertainty. You can also apply RAD (Rapid Application Development) or DevOps to your changes, because the cloud is configurable.

The freedom to be agile and the ability to self service are the key reasons why I will continue to need a cloud platform. It is also becoming rare to find organisations that are still fully on-premise. I am excited that the next years will be spent migrating whatever is needed to a cloud platform, one way or another!

REST and Filter

I just found a simple way to improve the latency of REST in Oracle APEX: set filter criteria in the REST GET during data synchronisation. You can use the query string "q" in the parameters to filter your GET results like SQL. E.g. myid="123" is equivalent to

myrestpoint/getresult/?q={"myid":{"$eq":"123"}}
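For illustration, the same filtered GET can be invoked from PL/SQL with APEX_WEB_SERVICE; the host and endpoint below are placeholders:

declare
  l_response clob;
begin
  -- the q filter restricts rows on the server, so less data travels over the wire
  l_response := apex_web_service.make_rest_request(
    p_url         => 'https://myhost/ords/myrestpoint/getresult/?q='
                     || utl_url.escape('{"myid":{"$eq":"123"}}', true),
    p_http_method => 'GET');
  dbms_output.put_line(substr(l_response, 1, 200));
end;
/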

Targeted Synchronisation

Unlike the previous synchronisation method, setting these criteria helps reduce the amount of data. A GET of 500 records typically loads within 10s, and at the UI, 10s is a long time. Thus, it is advisable to keep your GET under 5 seconds. In our design, I switched to this approach of setting a filter in the query parameter to optimise synchronisation. This helped shave 7s off the original 10s.

Further Tuning

REST is best kept simple for speed. Therefore, you should not have complicated extraction logic behind a GET. Adding validations during the REST transmission is also a big no. Each millisecond adds up and creates a sluggish UI.

It helps a lot to add filters to your GET during synchronisation. You will need to be careful at each stage of the GET, because everything adds up if you are not careful with latency.

Time to REST for APEX

Creating and setting up a REST API in Oracle APEX is becoming the easy part. After deploying the REST service to APEX, it was "time to REST"; or so I thought! The code inside the REST PL/SQL was creating latency, because I had added another procedure call within the PL/SQL. For starters, the feedback was that the UI did not refresh when users created or updated data. A closer look at the APEX debug log showed a 504 gateway timeout error.

Debugging 504 Gateway Timeout in APEX

To this day, HTTP status codes remain the most cryptic of errors, because they can point to a wide mix of root causes. Previous unit tests did not reveal any issues pertaining to this timeout; it only appeared once more testers started testing the system. After that, the 504 gateway timeout occurred and APEX REST did not refresh the UI correctly.

Track and Trace 504

To track the 504 gateway timeout error, we added a series of debug logs on APEX and the target application. We saw that the payload reached the target system correctly via REST. We also found that the target system was able to complete the PL/SQL. So, what went wrong? We started to trace back the changes of the past 24 hours. It was a painstaking process, and we had to roll back and retest the UI in quick iterations. We finally found the root cause when we commented out a procedure within the REST PL/SQL: that code had introduced latency that caused a timeout at the API gateway. Therefore, a quick fix was to increase the timeout at the API gateway.

Many will tear their hair out and lose sleep over HTTP error codes; I am certainly one of them. This is a simple case where full testing introduced latency and REST HTTP errors. Unexpected behaviour included APEX procedures failing to run their course, resulting in UI refresh issues. You may also see slowness in saving your data. Alas! It is time to tune REST, and there goes my rest time too.

APEX Format URL and Passing Parameters for Outside Context

The Oracle APEX URL has a strict syntax, something I realised after days of trying to link to an APEX page. You cannot shift the positions within the format, as the f?p syntax passes its arguments positionally. This includes the parameters you want to pass to the page, and it is even more challenging if you want to pass parameters from an outside context.
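For reference, the positional f?p format, followed by a sample link that passes a value to a hypothetical item P1_MYID (application and page numbers are placeholders):

f?p=App:Page:Session:Request:Debug:ClearCache:itemNames:itemValues

f?p=100:1:0::NO::P1_MYID:123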

Passing Parameters

A deformed URL will still load the APEX page; however, you will find that the parameters are not received. You can check via JavaScript in APEX by reading the parameter items and logging the values with console.log. Another factor is authenticated page vs public page: it seems that the parameters can only be passed via a public page.

Passing parameters from an outside context into APEX will require a different configuration. For the time being, a public page is friendly to receiving these parameters. I will still need to research how to get it working via an authenticated page.

APEX REST Enabled Interactive Grid

Hurray! I finally REST-enabled an Interactive Grid in Oracle APEX. It took a week to realise that an Interactive Grid cannot fully use a REST data source; you need custom PL/SQL for the DML processing that saves the Interactive Grid data. Thanks to this helpful blog for giving a good example of how to do it. The coding is much easier than direct ORDS code, as APEX provides utility APIs to execute against REST data sources. You can refer to the blog for the implementation, but I will share the pitfalls to look out for.

Turn On Debug

The most important step is to turn on APEX debug and enable level 8 debugging. You must also add debug logging to your procedure and PL/SQL code. Level 8 gives you detailed logging; you can always switch back to level 4 once you have found the error. The debug log will show the JSON payload when you execute the REST service. Do copy this payload into Postman to test that it is formed correctly. A common mistake is a typo or missing quotes in the JSON format.

c_log_level_engine_trace constant t_log_level := 8; -- Application Express engine: messages when procedures/functions are entered
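To switch this on from your own code, a minimal sketch using the documented APEX_DEBUG API:

begin
  apex_debug.enable(p_level => apex_debug.c_log_level_engine_trace); -- level 8
end;
/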

Validation Checks

REST JSON has no data type formatting during HTTP transmission. Thus, you need to format and validate your data before sending it, and again when you receive it at the target. If you are using date types, you will often hit character-to-number conversion errors. Be sure to format your dates to the desired date mask, e.g. "DD-MM-YYYY HH24:MI:SS", and convert them to varchar. Do be careful with null values and put in validation checks to prevent this error. You should also validate for a unique primary key. REST status codes are notoriously unhelpful, so do add loggers at the TARGET system to check where the errors occur.
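A minimal sketch of that formatting plus a null guard before building the payload; the variable stands in for a hypothetical source column:

declare
  l_order_date   date := sysdate;   -- stand-in for your source column
  l_payload_date varchar2(30);
begin
  if l_order_date is null then
    raise_application_error(-20001, 'order_date must not be null');
  end if;
  -- send dates as formatted strings so the target can parse them deterministically
  l_payload_date := to_char(l_order_date, 'DD-MM-YYYY HH24:MI:SS');
  dbms_output.put_line(l_payload_date);
end;
/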

It will take a few version upgrades before the Interactive Grid can be REST-enabled out of the box. In the meantime, you just have to follow this custom PL/SQL method for REST connectivity between APEX and the target system.