Extending E-Business Suite Natural Language Query (NLQ) to Support Custom Data Models

Johannes Michler PROMATIS Horus Oracle


Executive Vice President – Head of Platforms & Development

Oracle E-Business Suite (EBS) provides powerful Natural Language Query (NLQ) capabilities that allow users to interact with data using conversational language (see EBS Natural Language Query – Ask your E-Business Suite anything with AI). This functionality is robust for querying standard EBS tables in the following pre-defined domains:

  • Discrete Costing
  • Discrete Manufacturing
  • Dispute (AR)
  • Logistics (Inventory)
  • Maintenance
  • Order Management (OM)
  • Payment (AP)
  • People (HR)
  • Process Manufacturing (GME)
  • Project Manufacturing (PJM)

However, organizations often have custom data models that require similar access. Extending the NLQ framework to support custom data models enhances user experience and improves efficiency in retrieving relevant insights. Let's have a look at how (surprisingly) easily this can be achieved, based on a custom contract management solution we've implemented for one of our PROMATIS customers.

Revisiting the architecture

As shown in my previous blog posts, the overall architecture of the E-Business Suite NLQ feature looks as follows:

Architecture of E-Business Suite NLQ (see 3059877.1)

If we want to add an additional domain, e.g. from a custom contract management system, we have to perform five high-level steps:

  1. Provide one or multiple tables or (materialized) views in the EBS APPS schema that contain the relevant data
  2. Make the views/tables accessible to the XX_NLQ schema in a read-only fashion
  3. Replicate the data model as "empty tables" in the XX_NLQ_UTIL schema of the ADB
  4. Create a new AI Profile in the Autonomous Database
  5. Manage accessibility in EBS

Providing base data

First, you should evaluate which data to make available to the Natural Language Query model. I have had good experiences with de-normalized data, e.g. containing all contract data. You can provide this base data e.g. as a database view. If there is a lot of data and the view has to access a complex and large base data model that contains little relevant data, it might be helpful to create a dedicated materialized view, e.g. refreshed on an hourly basis, that contains all relevant data in a quickly accessible form.
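Such a de-normalized materialized view could be sketched as follows. All object and column names here are purely illustrative for a hypothetical custom contract data model (only hz_parties is a standard TCA table); adapt them to your own model:

```sql
-- Sketch: de-normalized contract data, refreshed hourly.
-- XX_CONTRACTS_ALL and all column names are hypothetical.
create materialized view APPS.XX_CONTRACTS_MV
  build immediate
  refresh complete
  start with sysdate next sysdate + 1/24   -- refresh every hour
as
select c.contract_number,
       c.contract_status,
       p.party_name  as customer_name,
       c.contract_volume,
       c.creation_date
from   xx_contracts_all c
join   hz_parties p on p.party_id = c.customer_party_id;
```

The point of the de-normalization is that the LLM only has to look at one wide, self-explanatory object instead of reconstructing joins across the base model.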

Keep in mind that the Select AI approach shown here heavily relies on the names of tables and their columns: remember to name things (especially columns) so that they can be understood by the Large Language Model (LLM).

After having identified the relevant tables/views/MVs for a domain, create a synonym and grant READ access to the XX_NLQ schema within EBS:

create synonym XX_NLQ.XX_CONTRACTS_V for APPS.XX_CONTRACTS_V;
exec AD_ZD.grant_privs('READ', 'XX_CONTRACTS_V', 'XX_NLQ');

Verify that you can access the data e.g. by using the Oracle APEX "SQL Workshop".
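A quick way to verify the grant, e.g. in the SQL Workshop, is a simple count against the synonym; it should return the row count rather than ORA-00942:

```sql
-- Run with XX_NLQ as the parsing schema
select count(*) from XX_NLQ.XX_CONTRACTS_V;
```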

Replicate data model in Autonomous Database

The next and very important step is to replicate the data model (just the model, without any data) within the Autonomous Database 23ai running in OCI. For this you might run:

create table XX_CONTRACTS_V (
  CONTRACT_NUMBER NUMBER,
  CONTRACT_STATUS VARCHAR2(30),
  CUSTOMER_NAME   VARCHAR2(400),
  CUSTOMER_NUMBER NUMBER,
  CUSTOMER_CITY   VARCHAR2(400),
  CONTRACT_VOLUME NUMBER,
  CREATION_DATE   DATE,
  ...
);

If your data in E-Business Suite resides in a table, you can just copy the DDL from e.g. Oracle SQL Developer. If it is a view, you might run "create table xx_test as select * from XX_CONTRACTS_V where rownum < 5;" and then use that xx_test table to get the DDL.
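Alternatively, DBMS_METADATA can generate the DDL for such a helper table directly; this sketch assumes the xx_test table from above was created in the APPS schema:

```sql
-- Generate the CREATE TABLE statement for the temporary copy of the view
select dbms_metadata.get_ddl('TABLE', 'XX_TEST', 'APPS') from dual;
```

The resulting DDL can then be run (without data) in the XX_NLQ_UTIL schema of the ADB.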

Especially for columns containing a limited number of values, it can be very beneficial to add table and column comments, e.g. as follows:

comment on table XX_CONTRACTS_V is 'This table contains information on customer contracts in the area of XYZ.';

comment on column XX_CONTRACTS_V.CONTRACT_STATUS is 'The column describes the contract status. Possible values are DRAFT for not yet signed contracts, ACTIVE for currently active contracts and TERMINATED for historic contracts.';

Creating an additional AI profile in ADB

After having made the metadata available in the Autonomous Database 23ai on OCI we have to create a new AI Profile for the newly created domain. This is straightforward:

Insert into FND_NLQ_DOMAIN_PROFILE_MAP (DOMAIN_NAME,BUSINESS_TERMS,PROFILE_NAME,PROFILE_PROMPT) values
('CONTRACTS','Contracts, Verträge, Kundenverträge','EBS_NLQ_CONTRACTS_PROFILE', NULL);

Insert into FND_NLQ_DOMAIN_TABLE_MAP (DOMAIN_NAME,OWNER,TABLE_NAME) values ('CONTRACTS','XX_NLQ_UTIL','XX_CONTRACTS_V');

Then run the nlq_util_ai_profiles.sql script provided by Oracle again, which will create a new Select AI profile. Then check whether the profile got created in USER_CLOUD_AI_PROFILES.
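A check like the following should do; the query assumes the STATUS column exposed by USER_CLOUD_AI_PROFILES in recent ADB releases, and the profile name matches the one inserted into FND_NLQ_DOMAIN_PROFILE_MAP above:

```sql
-- Run as the XX_NLQ_UTIL user in the ADB
select profile_name, status
from   user_cloud_ai_profiles
where  profile_name = 'EBS_NLQ_CONTRACTS_PROFILE';
```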

Manage accessibility in E-Business Suite

Create an additional Forms Function "EBS_NLQ_CONTRACTS_PROFILE":

New Forms Function for custom Contracts table

Then add the newly created function/permission to the permission sets of your choice. After compiling security, this should bring the new domain into the Ask E-Business Suite UI:

New Contracts Domain available

Some words on security

The approach above only controls overall access to the new domain. It is crucial that you also consider row-level access to the data you plan to expose to your users. For this, have a look at the seeded functions defined in nlq_datasec_functions.sql and create an appropriate filter for your use case (XX_CONTRACTS_V in my example): you definitely don't want to expose the salary of your CEO to everyone 😉 You can then assign a security function as follows:

DBMS_RLS.ADD_POLICY (
  object_schema   => 'APPS',
  object_name     => 'XX_CONTRACTS_V',
  policy_name     => 'XX_CONTRACTS_VPD',
  function_schema => 'APPS',
  policy_function => 'XX_CONTRACTS_FUNC1',
  statement_types => 'select'
);
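The policy function itself just returns a WHERE predicate that the database appends to every query. A minimal sketch, with a made-up predicate for illustration (restricting contracts to the operating unit of the current EBS session via fnd_global):

```sql
-- Hypothetical VPD policy function for XX_CONTRACTS_V.
-- The predicate logic is an example only; align it with the
-- seeded functions in nlq_datasec_functions.sql.
create or replace function APPS.XX_CONTRACTS_FUNC1 (
  p_schema in varchar2,
  p_object in varchar2
) return varchar2
is
begin
  -- Appended to every SELECT against the protected object
  return 'org_id = fnd_global.org_id';
end XX_CONTRACTS_FUNC1;
/
```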

Summary

The E-Business Suite Natural Language Query feature is not just powerful "out-of-the-box". As described in this blog post, it is quite easy to extend the functionality to your custom data models. In various PROMATIS projects we've seen that this provides impressive insights with very little effort.

Running E-Business Suite "Select AI" (NLQ) on-premises

Johannes Michler PROMATIS Horus Oracle


Executive Vice President – Head of Platforms & Development

While setting up an internal demo environment for PROMATIS, I recently ran into the problem that I didn't have an Autonomous Transaction Processing / 23ai database at hand (available and reachable through a VPN). Instead of setting up a VPN (or going through public internet traffic) and an ATP in OCI, I decided (for the sake of trying it) to build something on-premises that serves as the system where DBMS Select AI does its magic. Let's have a look.

Notice: Do not try this at home for real workloads. This is probably not a supported scenario.

Architecture of E-Business Suite Natural Language Query (NLQ) feature

In case you missed my previous blog post on the EBS NLQ feature you might want to have a look here.

Basically, here is the overall architecture again:

High level architecture of the solution; source: Oracle, see 3059877.1

The blog post here handles the "ADB on OCI" part in the top, where the metadata schema is handed over to the OCI (or OpenAI or other) Large Language Model. Usually that is running in a 23ai database on OCI; but for this blog post I replaced it with a simple docker container.

Starting an on-premises Autonomous Transaction Processing DB

The official Oracle Container Registry (which is always my first place to look for Oracle docker images; there are great containers from Gerald Venzl at https://github.com/gvenzl/oci-oracle-free as well) reveals an "Oracle Autonomous Database Free" database here: https://container-registry.oracle.com/ords/ocr/ba/database/adb-free

From a license perspective the container states: "The container image you have selected and all of the software that it contains is licensed under the Oracle Free Use Terms and Conditions which is provided in the container image. Your use of the container is subject to the terms of Oracle Free Use Terms and Conditions license." This should be good enough for some internal testing. 🙂

So, let's get the database up and running, which is just a matter of a single command and 5-10 minutes, provided you already have a docker engine running and a reasonably fast internet connection:

docker run -d \
  -p 1521:1521 \
  -p 8443:8443 \
  -e WORKLOAD_TYPE='ATP' \
  -e WALLET_PASSWORD=FancyPwd2025forNLQ \
  -e ADMIN_PASSWORD=FancyPwd2025forNLQ \
  -e ENABLE_ARCHIVE_LOG=False \
  --cap-add SYS_ADMIN \
  --device /dev/fuse \
  --name adb-nlq-ebs01 \
  container-registry.oracle.com/database/adb-free:latest-23ai

After a few minutes you should be able to access APEX under https://localhost:8443.

The credentials Internal/ADMIN/FancyPwd2025forNLQ should work to sign in to both APEX and SQL Developer Web.

Special considerations during the NLQ for EBS setup

Large parts of the setup work in the usual way, as greatly described in MOS Note 3059877.1.

Here are the "special things" I had to consider. Note: I am still using OCI and the Llama (in this case 3.3) model to run the inference. Should you be very sensitive regarding data privacy, you might replace the backend with some local Llama model. Or you could go for the OpenAI GPT-4o model instead. The supported providers are:

  • OpenAI
  • Cohere
  • Azure
  • OCI
  • Google
  • Anthropic
  • Hugging Face

So here are the special things to consider when running the ATP in a docker container:

  • When using Meta Llama as the LLM (by the way, I was using Llama 3.3-70b for this test) you may have to adjust nlq_util_ai_profiles.sql. I'll do some more testing comparing versions 3.1 and 3.3 of the model in a later blog post. The modification itself is simple though:
    l_llm_model VARCHAR2(100) := 'meta.llama-3.3-70b-instruct';
  • The authentication with OCI cannot use the usual resource principals. To overcome this, I've created a regular user and an API key. Put that user into an appropriate group and grant the group (instead of the instance principal) permission to use Generative AI services.
  • Then, as the XX_NLQ_UTIL user of the (local) ATP, create a credential using DBMS_CLOUD as follows (all in one line):
    exec DBMS_CLOUD.CREATE_CREDENTIAL(credential_name => 'adb_nlq_ebs01', user_ocid => 'ocid1.user.oc1..userid', tenancy_ocid => 'ocid1.tenancy.oc1..tenantid', private_key => 'allInOneLinePrivateKey', fingerprint => '12:34:56...');
  • Finally, again in nlq_util_ai_profiles.sql, exchange the LLM credential:
    l_llm_cred VARCHAR2(30) := 'adb_nlq_ebs01';

Testing OCI LLM Access

To verify that everything is working in the ATP, I used the following commands:

begin
  DBMS_CLOUD_AI.DROP_PROFILE(
    profile_name => 'TESTGENAI'
  );
  commit;
end;
/

BEGIN
  DBMS_CLOUD_AI.CREATE_PROFILE(
    profile_name => 'TESTGENAI',
    attributes   => '{"provider": "oci",
                      "credential_name": "adb_nlq_ebs01",
                      "object_list": [{"owner": "XX_NLQ_UTIL", "name": "DUAL"}],
                      "model": "meta.llama-3.3-70b-instruct",
                      "temperature": "0.4",
                      "oci_apiformat": "GENERIC",
                      "region": "eu-frankfurt-1",
                      "oci_compartment_id": "ocid1.compartment.oc1..12345"}'
  );
  commit;
END;
/

SELECT DBMS_CLOUD_AI.GENERATE(
         prompt       => 'What is Oracle DB',
         profile_name => 'TESTGENAI',
         action       => 'chat')
FROM DUAL;

This should give you something like the following:


Test accessing DBMS_CLOUD_AI with the LLM running in OCI

Further setup

The remaining setup was "as usual" and without any problems. Keep in mind that it will probably be necessary to set the application setting "ADB_HTTPS_HOST" to the name of your container host, since this is the CN that ORDS uses for the self-signed certificate.

Summary

The procedure above allows you to quickly set up the EBS NLQ feature with a local Autonomous Transaction Processing database running in a docker container on-premises. In the real world I'd obviously discourage doing this and recommend using a (supported) Oracle ATP instance in OCI instead. And hopefully your E-Business Suite instance is running in OCI as well anyway. 😉

Solving issues with new E-Business Suite APEX integration for existing users

Johannes Michler PROMATIS Horus Oracle


Executive Vice President – Head of Platforms & Development

In November 2024, Oracle released a new and greatly improved version of the E-Business Suite to Oracle APEX extension. When implementing this for two of our PROMATIS customers that already had an older solution in place, I ran into issues when running an adop cycle after running ADAPEXSCHREG.sql.

Here is how we were able to overcome this:

Issues with adop patch cycle

The problem we ran into was that ADAPEXSCHREG "forcibly" updates fnd_oracle_userid to set the password of the APEX schema/user to INVALID. This may be a good idea in general; however, if you have previously adspliced the APEX schema within E-Business Suite (which may be a good idea for various reasons), this will break a follow-up adop cycle: when adop validates the connection to an application that now has an "INVALID" password, it fails, breaking the adop cycle.

As shown in this blog post, I was able to solve this by restoring the entry in fnd_oracle_userid from a backup; however, you'll probably prefer not to run into this situation at all.

De-Register the custom application

Fortunately, it is easy to de-register the custom application. This process is described in 2085355.1 and basically all you have to do is the following:

cd $AD_TOP/bin/
perl adDeregisterCustomProd.pl


De-Registering the application

For at least one customer environment I had to run the process twice: once with the APP-CAPPS application name and once with the all-lowercase version.

Validate

After de-registering, I checked the fnd_oracle_userid table (the record should no longer be enabled) and the fnd_application table (the record should have vanished):
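The checks can be done with two simple queries; the schema/application short name 'XXAPEX' is just a placeholder for your own APEX parsing schema, and the queries assume the standard ENABLED_FLAG column of the FND model:

```sql
-- The de-registered schema should no longer be enabled
select oracle_username, enabled_flag
from   fnd_oracle_userid
where  oracle_username = 'XXAPEX';   -- placeholder

-- The custom application record should be gone (no rows)
select application_short_name
from   fnd_application
where  application_short_name = 'XXAPEX';   -- placeholder
```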


Record gone from fnd_application

      
      

You should then run an "adop -validate" and finally an (empty) patching cycle.

Only then did I run ADAPEXSCHREG.sql, which then did not cause any further issues. I hope Oracle will extend the ADAPEXSCHREG script to do some validation in this area.

Summary

With the above procedure it is possible to use a pre-existing APEX parsing schema with the new 12.2 APEX for EBS extension.

Restoring a single table from an OCI Cloud Manager backup

Johannes Michler PROMATIS Horus Oracle


Executive Vice President – Head of Platforms & Development

When upgrading an Oracle E-Business Suite environment to both 23ai (see here) and leveraging the new, pre-built integration between EBS and Oracle APEX (https://www.linkedin.com/posts/johannes-michler-099892ab_enhanced-integration-with-ebs-122-and-apex-activity-7265427474598043648-lIsr) I accidentally screwed up the FND_ORACLE_USERID table of my testing environment.

To avoid having to do a new production-to-test clone to try again, I investigated the (quite new) feature of RMAN to restore a single table within my PDB. Fortunately, I had taken a backup using E-Business Suite Cloud Manager a few days ago; and since that table doesn't change a lot, this backup should contain all the relevant data.

Let's see how the restore was performed:

Connecting to OCI Backup Service

Since the instance where the problem occurred was a testing instance that is not regularly backed up, and the Cloud Manager tooling I had used to create the backup a few days ago had only created a temporary connection, I first had to set up the OCI Cloud Backup module.

After downloading the module from http://www.oracle.com/technetwork/database/availability/oracle-cloud-backup-2162729.html, the first step was to install and configure the tool. This is done with the following command:

java -jar oci_install.jar -host https://objectstorage.eu-frankfurt-1.oraclecloud.com -pvtKeyFile johannes.key.pem -pubFingerPrint a0:... -uOCID ocid1.user.oc1..yyyy -tOCID ocid1.tenancy.oc1..zzzz -walletDir /tmp/opc/wallet/ -libDir /tmp/opc/lib/ -bucket TESTD_23a1DB

Connecting to rman

The next step was to verify the backup. In my case, since the control file still knew everything about the backup, that was straightforward. But in some situations you may have to look into cataloging the backup first, as described here.

To verify the backup I ran:

SET ENCRYPTION ON IDENTIFIED BY 'Top-SeCrEt2024' ONLY;
run {
  ALLOCATE CHANNEL CH1 TYPE SBT PARMS='SBT_LIBRARY=/tmp/opc/lib/libopc.so, SBT_PARMS=(OPC_PFILE=/u01/app/oracle/product/23.0.0.0/dbhome_1/dbs/opcCDBTESTD.ora)';
  RESTORE DATABASE PREVIEW SUMMARY FROM TAG='TESTD250117_23A1';
  RESTORE DATABASE VALIDATE HEADER FROM TAG='TESTD250117_23A1';
  RELEASE CHANNEL CH1;
}

Running recover table

After connectivity and access are established, the actual restore is very easy. The documentation for this is https://docs.oracle.com/en/database/oracle/oracle-database/23/bradv/rman-recovering-tables-partitions.html#GUID-B8245791-9B75-4AE7-ACE9-D9EDBC2DACC0.

The only challenge I faced was that the command cannot be executed in a run block with explicitly allocated channels (see 2105325.1). Instead, I had to configure the device first:

configure channel device type sbt_tape FORMAT '%U' PARMS='SBT_LIBRARY=/tmp/opc/lib/libopc.so, SBT_PARMS=(OPC_PFILE=/u01/app/oracle/product/23.0.0.0/dbhome_1/dbs/opcCDBTESTD.ora)';
CONFIGURE DEVICE TYPE sbt PARALLELISM 4;

Then I was able to run the actual recover command as follows:

recover table applsys.FND_ORACLE_USERID
  OF PLUGGABLE DATABASE TESTD
  UNTIL SCN 12489980593436
  auxiliary destination '/u01/install/custom/aux'
  remap table 'APPLSYS'.'FND_ORACLE_USERID':'OLD_FND_ORACLE_USERID';

Let's break this down:

  • First, I had to provide the table to be restored.
  • Then it is important to specify the "of pluggable database" clause; otherwise, the command will search for the table in the CDB.
  • Then you must of course provide the SCN or sequence or "as of" date to which you want to recover.
  • Furthermore, you need to provide an (aux) location where the restore/recovery takes place. In my case, for a (small) part of the E-Business Suite, that location used up to 75 GB of space during the restore.
  • Finally, you can remap the table in order to not overwrite the original table. It's also possible to just create a dump file that can then be imported with impdp.
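For that dump-file variant, RMAN supports a DATAPUMP DESTINATION together with NOTABLEIMPORT, so nothing is imported into the target database at all; the dump directory and file name below are illustrative:

```sql
-- Sketch: restore the table into a Data Pump dump only (no import)
recover table applsys.FND_ORACLE_USERID
  of pluggable database TESTD
  until scn 12489980593436
  auxiliary destination '/u01/install/custom/aux'
  datapump destination '/u01/install/custom/dump'   -- illustrative path
  dump file 'fnd_oracle_userid.dmp'                 -- illustrative name
  notableimport;
```

The resulting dump file can then be imported selectively with impdp whenever (and wherever) needed.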

After starting the command, it took roughly 30 minutes, during which the rman command created a temporary instance, restored the relevant datafiles, recovered the relevant table, dumped it using expdp and finally imported it under a new name into my actual database.

Export of the relevant table from the aux database

After the process completed I was able to access my lost/broken data again 🙂

Accessing the remapped and restored table data

Summary

The process allows restoring a single table (or a few tables) without restoring the entire database. Of course, if you plan ahead, there are more efficient ways as well (flashback or just a create-table-as-select copy). But as a last resort, the approach shown here was very helpful.

Do you need support running your Oracle E-Business Suite? With extensive experience in the Oracle ecosystem, PROMATIS ensures efficient data backup while also optimizing your Oracle Cloud environment strategically.

Migrating E-Business Suite with Cloud Manager to a private subnet

Johannes Michler PROMATIS Horus Oracle


Executive Vice President – Head of Platforms & Development

Sometimes, during the initial setup of E-Business Suite on OCI, a public load balancer subnet is used for both the Cloud Manager web interface and the E-Business Suite environments. Hopefully such scenarios are always at least secured through IP whitelisting and HTTPS-only. But still, it may be necessary to improve security later on by migrating those load balancers to private subnets.

Here are the steps we recently performed in a customer project:

Preparation of Cloud Manager

For the Cloud Manager web access, the following initial steps are required:

  • Create a new private subnet in the Virtual Cloud Network
  • Add proper routing rules and security lists to the subnet
  • Modify the Cloud Manager subnet security list
  • Modify the linux firewall of Cloud Manager:
    sudo firewall-cmd --zone=public --add-rich-rule 'rule family=ipv4 source address=172.31.15.0/24 port port=8081 protocol=tcp accept' --permanent
  • Create a new load balancer in the private subnet and add the backend

The actual "cutover" step then depends on whether you can use the same hostname as before:

  1. If you can use an unchanged hostname: change the DNS entry to point to the private IP address
  2. If you have to modify the hostname: re-wire Cloud Manager and IDCS; see the Oracle Cloud Manager documentation under "Update the Oracle E-Business Suite Cloud Manager Load Balancer Fully Qualified Domain Name (FQDN)": https://docs.oracle.com/cd/E18727_01/doc.121/f35809/T679330T679341.htm#cmg_manage_cmlbfqdn

The final step is to retire the old load balancer for the Cloud Manager.

Preparation for E-Business Suite environment

The initial steps for E-Business Suite are the same as for the Cloud Manager load balancer. For me it was not necessary to modify the linux firewall; this may depend on the OS of the E-Business Suite apps tier though.

In addition to the basic setup, after access through the new load balancer works, the following steps must be performed:

  1. Un-register the environment from Cloud Manager
  2. Create a new network profile with the private subnet
  3. Re-register the environment with Cloud Manager


New network profile with private subnet

Summary

Even though the overall procedure to migrate E-Business Suite Cloud Manager environments from a public to a private load balancer subnet is straightforward, the exact steps unfortunately are not officially documented. With the above guidance, the migration as performed by PROMATIS in these projects can be done, though, and I highly recommend doing it for security reasons in all production/important environments.