

Quick SQL



Overview

Quick SQL is a markdown-like shorthand syntax that expands to standards-based Oracle SQL. It is useful for rapidly designing and prototyping data models. Take a look at the example below:

(Quick SQL example image)

Previously, Quick SQL was only available within Oracle Application Express. This project reimplements the Quick SQL parser and translator as a JavaScript library that can be used in both NodeJS and the browser.

This repository also includes a Quick SQL to Entity Relationship Diagram (ERD) module, which can be used as shown in the example below:

(Quick ERD example image)

Install

npm install @oracle/quicksql

Translating Quick SQL into Oracle SQL Data Definition Language (DDL)

The Quick SQL to DDL translator is the product's core component. It allows users to transform a Quick SQL string into an Oracle SQL string.

The Quick SQL Syntax and Grammar are documented here

See below for examples of how to use this library.

Command Line Usage

The npm registry already contains a package with a conflicting name. To disambiguate, run npx with the scoped package name, as in the following command:

npx @oracle/quicksql test/apex/department_employees.qsql > output.sql

DDL NodeJS ECMA Script Module (ESM) Example

import { quicksql } from "@oracle/quicksql";
import fs from "fs";

try {
    // Read a Quick SQL model from disk and print the generated Oracle DDL.
    const text = fs.readFileSync( './test/department_employees.quicksql' );
    console.log( new quicksql( text.toString() ).getDDL() );
} catch( e ) {
    console.error( e );
}

DDL NodeJS Common JS (CJS) Example

const { quicksql } = require( "@oracle/quicksql" );
const fs = require( "fs" );

try {
    // Read a Quick SQL model from disk and print the generated Oracle DDL.
    const text = fs.readFileSync( './test/department_employees.quicksql' );
    console.log( new quicksql( text.toString() ).getDDL() );
} catch( e ) {
    console.error( e );
}

DDL Browser ECMA Script Module (ESM) Example

<script type="module">
    import { quicksql } from './dist/quick-sql.js';
    document.body.innerText = new quicksql(
`departments /insert 2
    name /nn
    location
    country
    employees /insert 4
        name /nn vc50
        email /lower
        cost center num
        date hired
        job vc255

view emp_v departments employees

# settings = { "prefix": null, "semantics": "CHAR", "DV": false }

`
    ).getDDL();
</script>

DDL Browser Universal Module Definition (UMD) Example

<script src="./dist/quick-sql.umd.cjs"></script>
<script>
    document.body.innerText = new quickSQL.quicksql(
`departments /insert 2
    name /nn
    location
    country
    employees /insert 4
        name /nn vc50
        email /lower
        cost center num
        date hired
        job vc255

view emp_v departments employees

# settings = { "prefix": null, "semantics": "CHAR", "DV": false }

`
    ).getDDL();
</script>

Transforming Quick SQL into an Entity-Relationship Diagram (ERD)

This requires a paid third-party library. Review the usage here.

Contributing

This project welcomes contributions from the community. Before submitting a pull request, please review our contribution guide.

Security

Please consult the security guide for our responsible security vulnerability disclosure process.

License

Copyright (c) 2023 Oracle and/or its affiliates.

Released under the Universal Permissive License v1.0 as shown at https://oss.oracle.com/licenses/upl/.


quicksql's Issues

Add support for non-paid ERDs

I'd love to see a way to quickly build ERDs without purchasing Joint+. It'd be nice to use that feature without needing to pay $3k USD.

Have other (free) or open-source options like MermaidJS or bpmn.io been explored?

Please Support /setnull Option Wherever /cascade is Supported to Enable Reverse Engineering Existing Schemas to Quick SQL

In order to reverse-engineer existing schemas into QuickSQL for visualization in the Oracle APEX QuickSQL diagram, support for the ON DELETE SET NULL foreign key constraint variant is required to round out support for the different kinds of foreign key constraints that can be expressed. Ideally, wherever /cascade is supported today, QuickSQL would also recognize /setnull and generate the appropriate ON DELETE SET NULL for the SQL in the foreign key constraint DDL it produces.
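For illustration only, here is how the proposed directive might be exercised through the JavaScript API. The /setnull directive does not exist today, and the foreign key clause in the comment is the desired output rather than what the library currently generates:

import { quicksql } from "@oracle/quicksql";

// Proposed: accept /setnull wherever /cascade is accepted today.
const model = `
departments
    name
    employees /setnull
        name
`;

// Desired foreign key clause (sketch):
//     constraint employees_department_id_fk
//     references departments on delete set null
console.log( new quicksql( model ).getDDL() );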

Boolean type uses the wrong case for the default value

This Quick SQL:

test
  approved boolean /default N

Yields this SQL:

create table test (
    id          number generated by default on null as identity
                constraint test_id_pk primary key,
    approved    varchar2(1 char) constraint test_approved
                check (approved in ('Y','N')) default on null 'n'
);

Note the default value n is lower case.

Missing customer_id in the diagram

Input:

customers
    name vc40
    customer_addresses /cascade
        street vc40
        city vc40
        state vc2
        zip_code vc10

orders
    total num
    customer /fk customers

products
    name vc40

order_lines
    order_id /fk orders
    product /fk products
    quantity num

Output:

{
  name: "customer_addresses",
  schema: null,
  columns: [
    {
      name: "id",
      datatype: "number",
    },
    {
      name: "street",
      datatype: "varchar2(40)",
    },
    {
      name: "city",
      datatype: "varchar2(40)",
    },
    {
      name: "state",
      datatype: "varchar2(2)",
    },
    {
      name: "zip_code",
      datatype: "varchar2(10)",
    },
  ],
}

example-cli has a particular test file hard coded

As defined in package.json, the example-cli script includes a hard-coded call to a particular test file (./test/project_management.quicksql). This causes it to ignore any other file that the user attempts to specify, and it appears to contradict the "Running the Example CLI" code sample provided. Simply changing the example-cli definition to just "./examples/cli.js" should fix the issue.

Escape character for /values

Would it be possible to have an escape character for text values? I recently wanted to add data for financial years, e.g. 22/23, 23/24 and 24/25, which didn't work. I've had similar issues adding text values with a quote, e.g. O'tool.
Examples using '^' as an escape character:

  1. financial_year /values 22^/23, 23^/24, 24^/25

  2. surname vc60 /values O^'Hara, O^'Tool, O^'Meara

  3. start_date /values to_date(^'01-MAR-2024^',^'DD-MON-YYYY^'), to_date(^'01-APR-2025^',^'DD-MON-YYYY^')

Certain check constraint values can change column data type

The following Quick SQL:

change_history
  data_type vc20 /check VARCHAR2,CLOB,TSWLTZ

Generates this SQL:

create table change_history (
    id           number generated by default on null as identity
                 constraint change_history_id_pk primary key,
    data_type    timestamp with local time zone constraint change_history_data_type_ck
                 check (data_type in ('VARCHAR2','CLOB','TSWLTZ'))
);

Note that data_type is a timestamp with local time zone, not a varchar2(20) as defined.

/fk directive overrides INT data type to number

This Quick SQL:

test
  foo_id int /nn /fk foo

Yields this SQL:

create table test (
    id        number generated by default on null as identity
              constraint test_id_pk primary key,
    foo_id    number
              constraint test_foo_id_fk
              references foo not null
);

Note that foo_id is a number, not an int as defined.

Bug 35950582 - /INSERT GENERATES EXTRA FK COLUMN

quicksql input producing the wrong insert statement with an extra column:

team_members
   username
projects /insert 10
   name
   project_lead /nn /references team_members

It generates the following insert statement (I omitted the statements before and after, as I just want to focus on the insert that is broken):

...
insert into projects (
    id,
    team_member_id,
    name,
    project_lead
) values (
    1,
    94,
    'Augusta Wilkerson',
    'N/A'
);
...

Please note there is an extra generated column called team_member_id. project_lead is the FK and should have the random IDs; team_member_id should not be there.

Quoted identifiers in QSQL are concatenated incorrectly

Quoted identifiers example from small_tests.js:

output = quicksql.toDDL( 
        `"Test" 
            "CamelCase"
            x   [coMment]  
            2   --coMment2  
    `);

//console.log(output);
assert( " 0 < output.indexOf('create table \"Test\"') " );                                     
//assert( " 0 < output.indexOf('Test_id_pk') " );                                     
assert( " 0 < output.indexOf('\"CamelCase\"') " );                                     
assert( " 0 < output.indexOf('comment on column \"Test\".x is ') " );                                     
assert( " 0 < output.indexOf('comment on column \"Test\".x2 is ') " );   

The second, commented out assertion is broken. The full output is:

create table "Test" (
    id             number generated by default on null as identity
                   constraint "Test"_id_pk primary key,   -- broken
    "CamelCase"    varchar2(4000 char),
    x              varchar2(4000 char),
    x2             varchar2(4000 char)
);

comment on column "Test".x is 'coMment';

Presumably, the same bug exists everywhere in the code where the name is concatenated. I suggest introducing a dedicated concat method which would handle quoted identifiers correctly.
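A minimal sketch of the kind of dedicated concat method suggested above, assuming identifier names may arrive with or without surrounding double quotes; the function name and placement are hypothetical, not part of the current code base:

// Hypothetical helper: build a derived identifier (e.g. a constraint name)
// from a possibly quoted base identifier without producing "Test"_id_pk.
function concatIdentifier( base, suffix ) {
    const isQuoted = base.startsWith( '"' ) && base.endsWith( '"' );
    // Strip the quotes, append the suffix, then re-quote if needed.
    const bare = isQuoted ? base.slice( 1, -1 ) : base;
    const combined = bare + suffix;
    return isQuoted ? '"' + combined + '"' : combined;
}

console.log( concatIdentifier( '"Test"', '_id_pk' ) );    // "Test_id_pk"
console.log( concatIdentifier( 'employees', '_id_pk' ) ); // employees_id_pk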

/default needs better support for Oracle functions and sequence.nextval

This model:

queues
  queue_id num /pk
  created dt /default systimestamp

Yields this SQL:

create table queues (
    queue_id      number generated by default on null as identity
                  constraint queues_queue_id_pk primary key,
    created_dt    varchar2(4000 char) default on null 'systimestamp'
);

Note that systimestamp is treated as a string, when it should be a function. I see sysdate works fine in the version of QS that ships with APEX 23.2, so some code was added to handle that. However, that seems to be broken on the current version here: https://krisrice.io/quick-sql-standalone.html

An allow list approach will probably not scale anyway, especially considering things like my_owner.my_sequence.nextval.

I recommend adding a new directive (e.g., /defaultraw), which would not wrap the word(s) after it as a string.
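For illustration only, this is how the proposed directive might be used through the JavaScript API; /defaultraw is the hypothetical name suggested above and is not implemented today, and the comment shows the desired output rather than what the library currently generates:

import { quicksql } from "@oracle/quicksql";

// Proposed: /defaultraw would pass the default expression through unquoted.
const model = `
queues
  queue_id num /pk
  created dt /defaultraw systimestamp
`;

// Desired column definition (sketch): the expression is emitted as-is,
// e.g.  created_dt ... default on null systimestamp
console.log( new quicksql( model ).getDDL() );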

Boolean Datatype 23c (Was: Order of 'default' with boolean produces invalid SQL)

This Quick SQL:

foo
  bar /boolean /default y

Creates this SQL:

create table foo (
    id     number generated by default on null as identity
           constraint foo_id_pk primary key,
    bar    varchar2(1 char) constraint foo_bar
           check (bar in ('Y','N')) 
);

Oracle throws ORA-00907: missing right parenthesis.

If default on null 'y' is moved before the constraint then the statement runs fine.

Unique constraint name unexpectedly leaves out table prefix if used

If you explicitly name your table with a prefix, its name is used correctly in constraint names. For example:

e01_t
   s vc20 /nn /unique

This produces the expected result of:

create table e01_t (
    id    number generated by default on null as identity
          constraint e01_t_id_pk primary key,
    s     varchar2(20 char)
          constraint e01_t_s_unq unique not null
);

However, if you use the #prefix setting to add the prefix, then the primary key constraint correctly includes the e01 table prefix, but the unique constraint does not.

t
   s vc20 /nn /unique
#prefix: e01

This produces the unexpected result (missing the e01_ prefix in the unique constraint name):

create table e01_t (
    id    number generated by default on null as identity
          constraint e01_t_id_pk primary key,
    s     varchar2(20 char)
          constraint t_s_unq unique not null
);

Generated PL/SQL Table API Lists Foreign Key Columns Twice, Producing Error

Reporting an issue from APEX discussion forum thread:

https://forums.oracle.com/ords/apexds/post/quick-sql-bug-in-api-generation-4298

Quick SQL Testcase:

feature_masters
    name

app_feature_items
   feature_master_id number /fk feature_masters
   name vc255 /nn

#api:y

The bug is that this produces PL/SQL APIs with the feature_master_id parameter repeated twice, as follows:

    procedure get_row (
        p_id        in  number,
        p_feature_master_id   out  number,
        p_feature_master_id   out  number,
        p_name   out  varchar2
    );

    procedure insert_row (
        p_id        in  number default null,
        p_feature_master_id    in  number default null,
        p_feature_master_id    in  number default null,
        p_name    in  varchar2 default null
    );

    procedure update_row (
        p_id        in  number default null,
        p_feature_master_id    in  number default null,
        p_feature_master_id    in  number default null,
        p_name    in  varchar2 default null
    );

Audit columns table directives only work with no space, contrary to the documentation

Documentation states:

Directive: /auditcols, /audit cols, /audit columns
Description: Automatically adds UPDATED, UPDATED_BY, INSERTED, and INSERTED_BY columns and the trigger logic to set the column values.

I'm unable to get /audit cols or /audit columns to work correctly in version 1.15, only /auditcols is working.

My example quick sql:

departments /audit cols
    name 
    employees /auditcols
        name /nn vc50
    another table /auditcols
        name

Tested at https://krisrice.io/quick-sql-standalone.html

Erroneous alter statement for "pk"="guid"

Input:

# settings = {"pk":"GUID"}
students /insert 2 
    name

Output:

create table students (
    id      number default on null to_number(sys_guid(), 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX') 
            constraint students_id_pk primary key,
    name    varchar2(255 char)
);



-- triggers
create or replace trigger students_biu
    before insert or update
    on students
    for each row
begin
    if :new.id is null then
        :new.id := to_number(sys_guid(), 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX');
    end if;
end students_biu;
/


-- load data

insert into students (
    id,
    name
) values (
    1,
    'Dean Stephens'
);
insert into students (
    id,
    name
) values (
    2,
    'Brett Matthews'
);

commit;

alter table students
modify id generated always  as identity restart start with 3;

Expected:

create table students (
    id                             number default on null to_number(sys_guid(), 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX') 
                                   constraint students_id_pk primary key,
    name                           varchar2(255 char)
)
;

-- load data
 
insert into students (
    name
) values (
    'Mark'
);

insert into students (
    name
) values (
    'Words'
);


Sample Data Model (Departments and Employees) produces bad SQL. Missing semi-colon, and bad BLOB insert.

Although this is uncoupled from APEX, there are a few basic data models we have preloaded. While testing, I realized that some do not work. Here is one that is broken:

departments /insert 4
   name /nn
   location
   country
   employees /insert 14
      name /nn vc50
      email /lower
      cost center num
      date hired
      job vc255

view emp_v departments employees

This is a simple departments and employees table with a view and some fake rows.

Here is the script it generates:

-- create tables

create table departments (
    id          number default on null to_number(sys_guid(), 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX') 
                constraint departments_id_pk primary key,
    name        varchar2(255 char) not null,
    location    varchar2(4000 char),
    country     varchar2(4000 char)
);


create table employees (
    id               number default on null to_number(sys_guid(), 'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX') 
                     constraint employees_id_pk primary key,
    department_id    number                     constraint employees_department_id_fk
                     references departments,
    name             varchar2(50 char) not null,
    email            varchar2(255 char),
    cost_center      number,
    the_start        date,
    job              varchar2(255 char),
    image            blob
);

-- table index
create index employees_i1 on employees (department_id);




-- triggers
create or replace trigger employees_biu
    before insert or update
    on employees
    for each row
begin
    :new.email := lower(:new.email);
end employees_biu;
/


-- create views
create or replace view emp_v as
select
    departments.id           department_id,
    departments.name         department_name,
    departments.location     location,
    departments.country      country,
    employees.id             employee_id,
    employees.name           employee_name,
    employees.email          email,
    employees.cost_center    cost_center,
    employees.the_start      the_start,
    employees.job            job,
    employees.image          image
from
    departments,
    employees
where
    employees.department_id(+) = departments.id/

-- load data

insert into departments (
    id,
    name,
    location,
    country
) values (
    1,
    'Delivery',
    'Garukme',
    'IL'
);
insert into departments (
    id,
    name,
    location,
    country
) values (
    2,
    'Manufacturing',
    'Covdiiku',
    'MH'
);
insert into departments (
    id,
    name,
    location,
    country
) values (
    3,
    'Sales',
    'Imaerosed',
    'VU'
);
insert into departments (
    id,
    name,
    location,
    country
) values (
    4,
    'Manufacturing',
    'Cugewpap',
    'CR'
);

commit;

insert into employees (
    id,
    department_id,
    name,
    email,
    cost_center,
    the_start,
    job,
    image
) values (
    1,
    4,
    'Elnora Payne',
    '[email protected]',
    84,
    sysdate-86,
    'Analyst',
    'N/A'
);
insert into employees (
    id,
    department_id,
    name,
    email,
    cost_center,
    the_start,
    job,
    image
) values (
    2,
    1,
    'Katie Anderson',
    '[email protected]',
    78,
    sysdate-9,
    'Architect',
    'N/A'
);
insert into employees (
    id,
    department_id,
    name,
    email,
    cost_center,
    the_start,
    job,
    image
) values (
    3,
    3,
    'Myrtie Maldonado',
    '[email protected]',
    7,
    sysdate-87,
    'Salesman',
    'N/A'
);
insert into employees (
    id,
    department_id,
    name,
    email,
    cost_center,
    the_start,
    job,
    image
) values (
    4,
    1,
    'Carrie Carlson',
    '[email protected]',
    12,
    sysdate-77,
    'Manager',
    'N/A'
);
insert into employees (
    id,
    department_id,
    name,
    email,
    cost_center,
    the_start,
    job,
    image
) values (
    5,
    3,
    'Lucas Larson',
    '[email protected]',
    48,
    sysdate-79,
    'Consultant',
    'N/A'
);
insert into employees (
    id,
    department_id,
    name,
    email,
    cost_center,
    the_start,
    job,
    image
) values (
    6,
    2,
    'Leo Vargas',
    '[email protected]',
    58,
    sysdate-75,
    'Engineer',
    'N/A'
);
insert into employees (
    id,
    department_id,
    name,
    email,
    cost_center,
    the_start,
    job,
    image
) values (
    7,
    3,
    'Verna Greene',
    '[email protected]',
    68,
    sysdate-62,
    'Architect',
    'N/A'
);
insert into employees (
    id,
    department_id,
    name,
    email,
    cost_center,
    the_start,
    job,
    image
) values (
    8,
    1,
    'Walter Hodges',
    '[email protected]',
    82,
    sysdate-17,
    'Consultant',
    'N/A'
);
insert into employees (
    id,
    department_id,
    name,
    email,
    cost_center,
    the_start,
    job,
    image
) values (
    9,
    3,
    'Franklin Nunez',
    '[email protected]',
    68,
    sysdate-95,
    'Consultant',
    'N/A'
);
insert into employees (
    id,
    department_id,
    name,
    email,
    cost_center,
    the_start,
    job,
    image
) values (
    10,
    4,
    'Seth Tran',
    '[email protected]',
    78,
    sysdate-25,
    'Engineer',
    'N/A'
);
insert into employees (
    id,
    department_id,
    name,
    email,
    cost_center,
    the_start,
    job,
    image
) values (
    11,
    1,
    'Della Page',
    '[email protected]',
    88,
    sysdate-73,
    'Manager',
    'N/A'
);
insert into employees (
    id,
    department_id,
    name,
    email,
    cost_center,
    the_start,
    job,
    image
) values (
    12,
    2,
    'Nicholas Harrison',
    '[email protected]',
    25,
    sysdate-29,
    'Architect',
    'N/A'
);
insert into employees (
    id,
    department_id,
    name,
    email,
    cost_center,
    the_start,
    job,
    image
) values (
    13,
    2,
    'Walter Lane',
    '[email protected]',
    10,
    sysdate-44,
    'Manager',
    'N/A'
);
insert into employees (
    id,
    department_id,
    name,
    email,
    cost_center,
    the_start,
    job,
    image
) values (
    14,
    3,
    'Edgar Little',
    '[email protected]',
    72,
    sysdate-38,
    'Salesman',
    'N/A'
);

commit;


-- Generated by Quick SQL 1.2.12 5/3/2024, 9:38:24 AM

/*
departments /insert 4
   name /nn
   location
   country
   employees /insert 14
      name /nn vc50
      email /lower
      cost center num
      start date date
      job vc255
      image blob
view emp_v departments employees

 Non-default options:
# settings = {}

*/

You can see that the view here is missing a ; and if you try to run this whole script, it fails with Error at line 22/1: ORA-00936: missing expression

Also, the inserts ALL fail with: ORA-01465: invalid hex number

This is due to the image column being a blob and trying to insert 'N/A' into it. BLOB columns should instead just get null.

Add support for check constraints other than "value in list"

QSQL currently supports check constraints at the column level, but only for "value in list" types of constraints. There are lots of other types of constraints we might want to model in QSQL. For example, you might want to ensure values in a number column are higher or lower than a fixed value. Or you might want to ensure text going into a text column is a certain case.

Such constraints would require QSQL to allow the developer to provide an expression and (potentially) a name for the constraint.

'N/A' column value for TIMESTAMP column type

Input:

employees /insert 1
      date hired

#settings={ date:timestamp}

Output:

create table employees (
    id            number generated by default on null as identity
                  constraint employees_id_pk primary key,
    date_hired    timestamp
);



-- load data

insert into employees (
    id,
    date_hired
) values (
    1,
    'N/A'
);

commit;

Confusing that #semantics:default is required to get default semantics

It is puzzling why, by default, QuickSQL does not give me default semantics for varchar2 column sizes.

I always have to include #semantics:default in order to get the default semantics.

It would make more sense for default to be the default for semantics, and that you could use #semantics to get a non-default semantics.

Add support for table level check constraints

The /unique directive can be used at the column level or the table level (for multiple columns). However, the /check directive only works at the column level. It would be nice to see support for this directive at the table level for multiple columns as well.

Granted, a check constraint at the table level isn't going to be just a simple IN list, but a more complex expression. I'll log a separate issue for that specifically.

Add GitHub Actions to build and test the library for every PR

Acceptance Criteria:

  • GitHub Actions run whenever a PR is submitted
  • Tests are executed; if they fail, the PR cannot be merged
  • For new feat and fix commits, GitHub Actions creates a tag and a release along with the corresponding NPM package

Use key words in the column name to default the data type

Currently in Quick SQL, if the column name includes _number then it defaults the column to NUMBER, and _date to DATE. Prior to 23.2, _name would default to VC255, but that has now gone. I would suggest this is reinstated, but shortened to the length at which the builder creates a Text Field rather than a Text Area. I would do the same for _title.

Implicitly, this should set test_name = vc90, test_description = vc4000, test_number = numeric and test_date = date


The sequence deletion statement is not created if DROP='Y'

Input

# pk: SEQ
# drop: Y
students 
    name

Output

drop table students cascade constraints;
-- create tables

create sequence  students_seq;

create table students (
    id      number default on null students_seq.nextval 
            constraint students_id_pk primary key,
    name    varchar2(255 char)
);

Expected output

drop table students cascade constraints;
drop sequence  students_seq;
-- create tables

create sequence  students_seq;

create table students (
    id      number default on null students_seq.nextval 
            constraint students_id_pk primary key,
    name    varchar2(255 char)
);

Aggregate QSQL API calls into a single one

Currently, the QSQL API contains dependent functions. For example, the methods listing errors/warnings, or translating to an ERD, call the main method toDDL(), which builds the parse tree. A better design would be a single method that constructs a composite object including all the information: the parse tree, the list of errors, the translated SQL code, and the ERD.
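A rough sketch of what such a composite result could look like; only getDDL() below is a real call, while the wrapper function and every other property are placeholders for information the proposed single method would return:

import { quicksql } from "@oracle/quicksql";

// Hypothetical wrapper illustrating the proposed composite result object.
function describe( text ) {
    const qsql = new quicksql( text );
    return {
        ddl:      qsql.getDDL(),  // the translated Oracle SQL
        tree:     null,           // the parse tree (placeholder)
        errors:   [],             // list of errors (placeholder)
        warnings: [],             // list of warnings (placeholder)
        erd:      null            // the ERD model (placeholder)
    };
}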

QuickSQL should generate compound triggers by default

Originally, this was suggested in the APEX Ideas application: https://apexapps.oracle.com/pls/apex/r/apex_pm/ideas/details?idea=FR-3332
Below is the gist, and some of the information from the thread there.

Idea Summary

Right now QuickSQL generates non-compound triggers with names like “employees_biu” (employees before insert update) leading folks down the path of creating another trigger like “employees_aiu” (employees after insert update). Generally a table should really only have a single trigger (yes, if you are using Edition Based Redefinition you'd have two triggers). Since compound triggers were introduced in Oracle 11g, that single trigger should be a compound trigger.

Use Case

Compound triggers have many advantages over multiple triggers including global declarations and a specific firing order, but by far the biggest advantage is a single place to look for trigger logic.

When a trigger in QuickSQL is generated, it should look something like the below:

create or replace trigger <trigger-name> --example: employee_compound_trigger
  for insert or update or delete on <table-name>
    compound trigger

  -- global declarations
  -- g_global_variable varchar2(10);

  before statement is
  begin
    null; -- do something here.
  end before statement;

  before each row is
  begin
    null; -- do something here.
  end before each row;

  after each row is
  begin
    null; -- do something here.
  end after each row;

  after statement is
  begin
    null; -- do something here.
  end after statement;

end <trigger-name>;
/

There are no benefits to using separate triggers, only detriments. (My opinion.)

If you really, really want the ability to have QuickSQL generate separate triggers, you could generate compound triggers by default (as one should ;) ) and provide an option of “Add complexity to our implementation by implementing triggers as separate triggers, but realize that if you need to share state or variable information between those triggers you'll have to put that information into packages that will need to be in scope for the triggers”. Or you could just label it “Use legacy triggers”.

Steven Feuerstein chimed in with the following:

I love this idea - compound triggers have been around since 11g.

Basically, if you agree that all procs and funcs should be inside packages and not “stand-alone”, the very same argument applies to individual triggers vs compound.

Performance generating SQL with more than a few tables is not great

I use Quick SQL in APEX where there's an editor on the left and the SQL output on the right. I was recently working on a new model that had 28 tables and ~300 total columns (~11 columns per table). The model had 36 ref constraints. While not small, this isn't a very large or complex schema.

It takes Quick SQL about 2 seconds to generate the SQL for this schema. While that may not seem like much, because it blocks the main event loop, it makes interactions with input fields difficult to manage and results in a bad user experience. APEX is using a debounce, but it's not sufficient to address the problem. Anything blocking the event loop for that long will result in a bad UX.

An async API to generate the SQL would fix the problem.
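Until such an API exists, one possible workaround sketch (not part of the library) is to run the existing synchronous translator inside a Web Worker so it no longer blocks the main thread. This assumes the browser ESM build at ./dist/quick-sql.js and module-worker support; the file names are made up:

// worker.js: run the synchronous translation off the main thread.
import { quicksql } from './dist/quick-sql.js';

self.onmessage = ( event ) => {
    // event.data is the Quick SQL text; reply with the generated DDL.
    self.postMessage( new quicksql( event.data ).getDDL() );
};

// main thread: wrap the worker in a Promise so callers can await the DDL.
function toDDLAsync( text ) {
    return new Promise( ( resolve, reject ) => {
        const worker = new Worker( './worker.js', { type: 'module' } );
        worker.onmessage = ( e ) => { resolve( e.data ); worker.terminate(); };
        worker.onerror   = ( e ) => { reject( e ); worker.terminate(); };
        worker.postMessage( text );
    } );
}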

"Prefix primary keys with table name" with SYSGUID error

When using the "Prefix primary keys with table name" setting with a sysguid population method, a trigger is created that uses :new.id rather than :new.[table name]_id. This causes an error when the script is run.

Populating the primary key in the trigger is not actually needed anyway as there is a default value on the column in the table.

"Prefix primary keys with table name", table api error

The generated table API is not respecting the prefixPKwithTname:true setting.

For example:

# settings = {prefixPKwithTname:true, "api":"Y"}
person
  first_name 
  last_name
  first_date
addreess
  address1
  address2
  person_id

The resulting table API includes:

    begin 
        insert into addreess ( 
            id,
            person_id,
            address1,
            address2
        ) values ( 
            p_id,
            p_person_id,
            p_address1,
            p_address2
        );

The ID column should be ADDRESS_ID, and I'd like to see p_id become p_address_id.

Thank you.

/insert generates unexpected 'N/A' string for a blob column & data too long for vc20

When creating a table with a blob and requesting the generation of sample data, Quick SQL unexpectedly provides the literal string 'N/A' as the sample data value for the blob:

reports /insert 1
        created date
        description vc20
        pdf blob

This produces:

insert into reports (
id,
created,
description,
pdf
) values (
1,
sysdate-70,
'Om pikawo pe amopi woem efum ji ga sefze figi pomlot dadeziguz seak nigamu luv. Boma oziworpaz re cuznim reletwaj lojeibe lolmuj tujle tovufmu vofizo nuzi regusoj iwmum busifurih weof.',
'N/A'
);

This generated insert statement has two problems: the 'N/A' produces an error ORA-01465: invalid hex number, and the sample data for the vc20 column description produces an error ORA-12899: value too large for column.

A workaround for the blob issue is to use /values null on the blob column, and a workaround for the vc issue is to use explicit values such as /values One Value,Another Value,Something Else,Fourth Example, like this:

reports /insert 1
        created date
        description vc20 /values One Value,Another Value,Something Else,Fourth Example
        pdf blob /values null

Output DDL has `undefined` for QuickSQL version

QuickSQL input:

dept
    name

The commented section in the output DDL has the following:

-- Generated by Quick SQL undefined 12/6/2023, 12:22:36 AM

/*
dept
    name

 Non-default options:
# settings = {}

*/

After -- Generated by Quick SQL, it should show the QuickSQL version instead of undefined.
