Oracle Prequel

April 1, 2018

Oracle Spokesperson April Fulz released the following brief statement today:

The Oracle database has undergone many changes to its release numbering scheme over the years, causing an undue amount of confusion. To alleviate this confusion we have decided to go right back to the beginning with Release 1.0.

You may know that we skipped release 1.0 in the early days for fear that nobody would trust the 1.0 release of any software. This was just the first rule of our release numbering scheme. Other highlights include:

  • Use incremental whole numbers for each release to denote forward movement and added features
  • Offset the intimidation of whole-number increments by including decimal points
  • Append decremental lowercase letters (i, g, c) to further offset the perceived complexity (thankfully industry trends such as Internet [i] and Cloud [c] arose to make the chosen letters more relevant)
  • The years between 13 and 17 were awkward for a lot of us so we planned to skip those release numbers and go directly to 18 (thankfully the year 2018 arrived just in time for this)
  • Just like the great movie franchises we always planned a prequel release of our flagship Database that we are now proudly branding Oracle RDBMS 1.0

More information will be made available closer to the final release of…the first release.

“JSON Schema” One

During the February 2018 ODTUG Code Talk about JSON and PLSQL an attendee asked the following question:

Can we validate JSON using JSON-schema (similar to XML-schema)?

I did not have enough information at the time to answer the question, so I did some research afterwards and found that XML schemas have three basic requirements:

1. Provide a list of elements XML contains
2. Provide documentation for humans and (XML) machines
3. Constrain data input to match the specified elements

I don’t know if there is such a thing as JSON Schema, but Oracle satisfies the first two requirements easily with JSON Data Guides. Below is an explanation using code:

-- Create a table with a JSON column
CREATE TABLE json_data
 ( json_id NUMBER NOT NULL PRIMARY KEY,
   json_col CLOB,
CONSTRAINT must_be_json
CHECK ( json_col IS JSON ));

-- Insert a record with a single JSON key-value pair
INSERT INTO json_data
VALUES(1,'{"KeyOne":"ValueOne"}');

-- Query the dataguide for the table using:
-- 1) JSON_DATAGUIDE function to extract element info:
--    o:path = Path to the element from the JSON top
--    type = the data type of the element
--    length = the length of the element
-- 2) JSON_TABLE to translate JSON element into columns
--    and rows
WITH dataguide_view AS
  ( SELECT JSON_DATAGUIDE(json_col) dataguide_rows
FROM json_data
)
SELECT edetails.*
  FROM dataguide_view,
       JSON_TABLE(dataguide_rows,'$[*]'
    COLUMNS epath VARCHAR2(100) PATH '$."o:path"',
            etype VARCHAR2(10) PATH '$."type"',
            elength NUMBER PATH '$."o:length"') edetails;

EPATH     ETYPE   ELENGTH
--------- ------- -------
$.KeyOne  string        8

-- Now let's add some more complex elements such as arrays
INSERT INTO json_data
VALUES(2,'{"KeyOne":"ValueOne",
           "KeyTwo":"ValueTwo",
           "KeyThree":{"K1":"V1","K2":"V2"},
           "KeyFour":[1,2,3,4,{"K4":"V4"}]}');
 1 row created.

-- Running the dataguide query again shows this result:
EPATH         ETYPE   ELENGTH
------------- ------- ------- 
$.KeyOne      string        8
$.KeyTwo      string        8
$.KeyFour     array        32
$.KeyFour.K4  string        2
$.KeyThree    object       32
$.KeyThree.K1 string        2
$.KeyThree.K2 string        2

This output covers every element in every record of the table. Note that a Data Guide does not inherently constrain JSON data: whenever an element is added or removed, the Data Guide simply updates to match.
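The flat rows above are one of two Data Guide formats. As a hedged aside (based on my reading of the DBMS_JSON package in 12.2), JSON_DATAGUIDE can also return the Data Guide as a single hierarchical document that looks very much like a JSON Schema, with nested "type" and "properties" keys:

-- Return the Data Guide as one hierarchical, pretty-printed
-- JSON document instead of flat rows. FORMAT_HIERARCHICAL
-- and PRETTY are constants in the DBMS_JSON package.
SELECT JSON_DATAGUIDE(json_col,
                      DBMS_JSON.FORMAT_HIERARCHICAL,
                      DBMS_JSON.PRETTY)
  FROM json_data
 WHERE json_id = 2;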

In the next post I’ll demonstrate using Data Guide to satisfy the third requirement – “Constrain data input to match the specified elements”.

Thanks for reading!


Validating And Constraining JSON

My February ODTUG Code Talk about Oracle PLSQL and JSON spawned many great questions from the attendees. I’ll answer them in this series of posts.

The first two questions asked were:

  1. Can you validate data being stored?
  2. Can you create constraints?

I’ll answer them both below.

First up is an example of validating JSON with a function-based index that uses the “ERROR ON ERROR” clause to check that the JSON contains a specific key.
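Here is a minimal sketch of that technique, assuming a json_data table like the one created below (the index name and alias are my own): JSON_VALUE normally returns NULL for a missing key, but with ERROR ON ERROR it raises an error instead, so the index cannot be built for, and will reject, any row whose JSON lacks the required key.

-- Sketch: require every row's JSON to contain a KEY1 element.
-- With ERROR ON ERROR, JSON_VALUE raises an error for a row
-- without KEY1 rather than indexing it as NULL.
CREATE INDEX key1_required ON
   json_data jd ( JSON_VALUE(jd.json_col,'$.KEY1' ERROR ON ERROR) );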

And now for some code!

The code below demonstrates unique key constraints and value checking. It also shows triggers for validating and constraining JSON.

--
-- The WITH UNIQUE KEYS clause stipulates that
-- duplicate KEYS cannot exist at the same level in
-- the JSON data. Please note this refers to KEY
-- name and NOT the VALUE of a KEY:VALUE pair.
-- Also note it only checks within the "current"
-- set of JSON data.
--
CREATE TABLE json_data
( json_id NUMBER NOT NULL PRIMARY KEY,
  json_col CLOB,
CONSTRAINT uniqueness_reqd
CHECK ( json_col IS JSON WITH UNIQUE KEYS ));

-- This row has unique KEYS of KEY1 and KEY2
INSERT INTO json_data
VALUES(1,'{"KEY1":"VAL1",
           "KEY2":"VAL2"}');

-- This row does not have unique keys at the top
-- level so it cannot be inserted
INSERT INTO json_data
VALUES(2,'{"KEY1":"VAL1",
           "KEY1":"VAL77"}');
INSERT INTO json_data
*
ERROR at line 1:
ORA-02290: check constraint (D.UNIQUENESS_REQD)
           violated
--
-- Unique indexes can be created to ensure that the
-- VALUE side of a KEY:VALUE pair is unique. In this
-- example the KEY name is KEY1. Unique indexes apply
-- across ALL records in the table.
--
CREATE UNIQUE INDEX unique_val1 ON
   json_data jd ( JSON_VALUE(jd.json_col,'$.KEY1'));
Index created.

-- Query what rows exist in the table
SELECT JSON_VALUE(jd.json_col,'$.KEY1') jv
  FROM json_data jd;
JV
--------------------
VAL1

-- Try to insert a value of VAL1 for KEY1 which
-- violates the unique index
INSERT INTO json_data
VALUES(3,'{"KEY1":"VAL1"}');
INSERT INTO json_data
*
ERROR at line 1:
ORA-00001: unique constraint (D.UNIQUE_VAL1) violated

-- This insert of VAL11 (eleven) succeeds
INSERT INTO json_data
VALUES(4,'{"KEY1":"VAL11"}');
1 row created.

-- Triggers can also be created to validate data
CREATE OR REPLACE TRIGGER json_trigger
   AFTER INSERT ON json_data
   FOR EACH ROW
BEGIN
  IF JSON_VALUE(:NEW.json_col,'$.KEY1') = 'BAD' THEN
    RAISE_APPLICATION_ERROR(-20000,'Bad JSON');
  END IF;
END;
/
Trigger created.

-- The trigger fires and prevents this insert based
-- on the VALUE for the KEY1 key
INSERT INTO json_data
VALUES(5,'{"KEY1":"BAD"}');
INSERT INTO json_data
 *
ERROR at line 1:
ORA-20000: Bad JSON
ORA-06512: at "D.JSON_TRIGGER", line 3
ORA-04088: error during execution of
                      trigger 'D.JSON_TRIGGER'

-- This insert succeeds
INSERT INTO json_data
VALUES(6,'{"KEY1":"GOOD"}');
1 row created.

-- Now query what's in the table.
SELECT JSON_VALUE(jd.json_col,'$.KEY1') jv
  FROM json_data jd;
JV
--------------------
VAL1
VAL11
GOOD

There is one more way to validate JSON. You can specify a procedure to run when the Data Guide for the JSON data changes. Beda Hammerschmidt of Oracle shows a demo of this.
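A hedged sketch of how that hook is wired up (syntax as I understand it from the 12.2 JSON search index feature; the index name is hypothetical): a JSON search index accepts a DATAGUIDE ON CHANGE parameter naming a procedure to run whenever the Data Guide changes. The supplied DBMS_JSON.ADD_VC procedure adds virtual columns for newly discovered elements; a user-written PL/SQL procedure can be named instead.

-- Register a procedure to run when the Data Guide changes.
-- DBMS_JSON.ADD_VC (supplied by Oracle) adds virtual columns
-- for new elements; substitute your own procedure name to run
-- custom validation logic on change.
CREATE SEARCH INDEX json_search_idx ON json_data(json_col)
   FOR JSON
   PARAMETERS('DATAGUIDE ON CHANGE DBMS_JSON.ADD_VC');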

Thanks for reading!