JSON Data Guide Nuance 2

The previous post used numeric and string values to demonstrate how Oracle JSON Data Guide may change assigned datatypes to accommodate mixed data. This post demonstrates the same concept using JSON boolean values.

SQL> INSERT INTO json_data
  2  VALUES(1,'{"RAPTOR_LOST":true}');
1 row created.

SQL> INSERT INTO json_data
  2  VALUES(2,'{"RAPTOR_LOST":"YES"}');
1 row created.

SQL> SELECT JSON_DATAGUIDE(json_col) boolean
  2    FROM json_data
  3  GROUP BY json_id;
BOOLEAN
----------------------------------------------------------
[{"o:path":"$.RAPTOR_LOST","type":"boolean","o:length":4}]
[{"o:path":"$.RAPTOR_LOST","type":"string","o:length":4}]

SQL> SELECT JSON_DATAGUIDE(json_col) boolean
  2    FROM json_data;
BOOLEAN
---------------------------------------------------------
[{"o:path":"$.RAPTOR_LOST","type":"string","o:length":4}]

Be careful to allow for these changes in your code.
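Code that reads this key should allow for both representations. Here's a minimal sketch (mine, not from the post) that assumes the JSON_DATA table from the previous post and relies on JSON_VALUE returning the JSON boolean true as the string 'true':

SELECT json_id,
       -- JSON_VALUE returns JSON booleans as the strings 'true'/'false',
       -- so normalize the boolean and string forms before comparing
       CASE
         WHEN JSON_VALUE(json_col, '$.RAPTOR_LOST') IN ('true', 'YES')
           THEN 'YES'
         ELSE 'NO'
       END AS raptor_lost
  FROM json_data;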

Thanks for reading!

JSON Data Guide Nuance 1

The JSON Data Guide provides a parseable list of the elements (including name, datatype and length) contained in JSON data. This makes parsing much faster by eliminating the need to interrogate the data for its structure before extracting the desired values.

But Data Guide has a few nuances to be aware of. The first is that datatypes may change in summary entries. Allow me to demonstrate:

I’ll create two Oracle views showing Data Guide info for a table named JSON_DATA. The table has two columns – one containing JSON data and another containing a numeric primary key.
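The post doesn't include the table's DDL, but based on that description it would look something like this sketch (the column size and IS JSON check are my assumptions):

CREATE TABLE json_data (
  json_id  NUMBER PRIMARY KEY,
  json_col VARCHAR2(4000) CHECK (json_col IS JSON) -- ensure valid JSON
);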

The first view (ALL_DG) ignores the primary key field to show the Data Guide for all records in the table.

CREATE OR REPLACE VIEW all_dg AS
WITH dataguide_view AS
  ( SELECT JSON_DATAGUIDE(json_col) dataguide_rows
      FROM json_data )
SELECT edetails.*
  FROM dataguide_view,
       JSON_TABLE(dataguide_rows,'$[*]'
  COLUMNS epath VARCHAR2(100) PATH '$."o:path"',
          etype VARCHAR2(10) PATH '$."type"',
          elength NUMBER PATH '$."o:length"') edetails;
View created.

The next view (DG_PER_ID) includes the primary key. Because the JSON_DATAGUIDE aggregate is grouped by that key, each result row describes ALL of the JSON keys for a single record.

CREATE OR REPLACE VIEW dg_per_id AS
WITH dataguide_view AS
  ( SELECT json_id,
           JSON_DATAGUIDE(json_col) dataguide_rows
      FROM json_data
    GROUP BY json_id)
SELECT json_id,
       edetails.*
  FROM dataguide_view,
       JSON_TABLE(dataguide_rows,'$[*]'
  COLUMNS epath VARCHAR2(100) PATH '$."o:path"',
          etype VARCHAR2(10) PATH '$."type"',
          elength NUMBER PATH '$."o:length"') edetails;
View created.

Before I query the views, let's take a look at the data in the table. It contains two distinct keys – RAPTORS and CAVALIERS. The value for CAVALIERS is the number 4, which is the number of games it took them to dispatch the Raptors in this year's NBA playoffs. The RAPTORS key has two values – the number 0, which denotes the number of games they won in the series, and the phrase 'LOST AGAIN', which summarizes their playoff run…again…this year.
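The INSERT statements that loaded these rows aren't shown in the post, but they presumably looked something like this:

INSERT INTO json_data VALUES (1, '[{"RAPTORS":0},{"CAVALIERS":4}]');
INSERT INTO json_data VALUES (2, '{"RAPTORS":"LOST AGAIN"}');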

SELECT *
  FROM json_data;
   JSON_ID JSON_COL
---------- -------------------------------
         1 [{"RAPTORS":0},{"CAVALIERS":4}]
         2 {"RAPTORS":"LOST AGAIN"}

First, a query from the detailed view that includes the primary keys. No surprises here.

SELECT *
  FROM dg_per_id;
JSON_ID EPATH        ETYPE  ELENGTH
------- ------------ ------ -------
 1      $.RAPTORS    number       1
 1      $.CAVALIERS  number       1
 2      $.RAPTORS    string      16

Now let's query the summary view. Notice that the datatype for the RAPTORS key has changed.

SELECT *
  FROM all_dg;
EPATH       ETYPE      ELENGTH
----------- ---------- -------
$.RAPTORS   string          16
$.CAVALIERS number           1

This happens because this query summarizes ALL of the records at once. Since one value for RAPTORS is a number and the other is a string, Data Guide chooses string, the datatype that can safely accommodate both values.

Be aware of this when using Data Guide because it may require extra datatype checks in your code. And remember not to make assumptions about the datatypes you'll get back.
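One way to build such a check is to compare the two views defined above, flagging any key whose per-record datatype disagrees with the summary datatype. A sketch:

-- Paths whose per-record datatype differs from the summary datatype;
-- these are the keys that need defensive handling in code.
SELECT d.json_id,
       d.epath,
       d.etype AS record_type,
       a.etype AS summary_type
  FROM dg_per_id d
  JOIN all_dg a
    ON a.epath = d.epath
 WHERE d.etype <> a.etype;

With the sample data this returns one row: $.RAPTORS for JSON_ID 1, whose per-record type is number while the summary type is string.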

Thanks for reading!