The Ax.rs.Writer function provides a mechanism to write data from a ResultSet data source to CSV, Excel, JSON, SQL or text formats, or to commit it directly to a database transaction. This is also known as data export.

1 Writer function

The writer function provides distinct methods to generate the different output formats:

Method  Output
csv     generates CSV files
excel   creates Excel files
json    creates JSON files
sql     creates SQL commands to insert data
db      a particular option to insert data directly into a database table; in this case the output is a database transaction

As usual, to describe the Ax.rs.Writer function we can use the Class function.

<script>
console.log(Class.describe(Ax.rs.Writer))
console.log(Class.describe(Ax.rs.Writer, "DBExportOptions"));
</script>
class JSResultSetWriter {
                                 -- constructors
                                (*) JSResultSetWriter(IResultSetConvertible)

                                 -- methods class
                             String describe();

                                 -- methods JSResultSetWriter extends class deister.axional.server.jdbc.io.dbexport.impl.ResultSetDBExportImpl
                          DBExportStatistics getStatistics();
                          ResultSet csv(Consumer<CSVExportOptions>);
                          ResultSet csv(OutputStream);
                          ResultSet csv(File);
                          ResultSet db(Consumer<DBExportOptions>);
                          ResultSet excel(OutputStream);
                          ResultSet excel(File);
                          ResultSet excel(Consumer<ExcelExportOptions>);
                          ResultSet json(Consumer<JSONExportOptions>);
                          ResultSet sql(Consumer<SQLExportOptions>);

}

You can call getStatistics() to obtain a DBExportStatistics object with information about the number of rows inserted, updated and excluded, and the number of errors handled.

<script>
    Ax.db.execute("DROP TABLE IF EXISTS antibiotics");
    var rs_writer = new Ax.rs.Reader().csv(options => {
        options.setResource("https://bitbucket.org/deister/axional-docs-resources/raw/master/CSV/antibiotics1.csv");
    }).writer();
    
    
    var rs1 = rs_writer.db(options => {
        options.setLogger(console.getLogger());
        options.setConnection(Ax.db.getObject());
        options.setTableName("antibiotics");
        options.setTableCreate(true);
        options.setTablePrimaryKey("pk_antibiotics", "bacteria");
    });
    
    // Get statistics of db writing
    var rs_stats = rs_writer.getStatistics();
    
    console.log(rs_stats);
    console.log("   Error: " + rs_stats.getErrorCount());
    console.log("  Insert: " + rs_stats.getInsertCount());
    console.log("  Update: " + rs_stats.getUpdateCount());
    console.log("Excluded: " + rs_stats.getExcludeCount());
    console.log(rs1);

</script>
10 row(s) inserted, 0 row(s) updated, 0 row(s) excluded, 0 row(s) with error

   Error: 0
  Insert: 10
  Update: 0
Excluded: 0
+-------+-------------------------------+----------+------------+--------+--------+
|Id     |Bacteria                       |Penicillin|Streptomycin|Neomycin|Gram    |
|integer|char(31)                       |double    |double      |double  |char(8) |
+-------+-------------------------------+----------+------------+--------+--------+
|      1|Mycobacterium tuberculosis     |      0.00|        5.00|    2.00|        |
|      2|Salmonella schottmuelleri      |      0.00|        0.80|    0.09|        |
|      3|Proteus vulgaris               |      3.00|        0.10|    0.10|negative|
|      4|Klebsiella pneumoniae          |    850.00|        1.20|    1.00|negative|
|      5|Brucella abortus               |      1.00|        2.00|    0.02|negative|
|      6|Pseudomonas aeruginosa         |    850.00|        2.00|    0.40|negative|
|      7|Escherichia coli               |    100.00|        0.40|    0.10|negative|
|      8|Salmonella (Eberthella) typhosa|      1.00|        0.40|    0.01|negative|
|      9|Aerobacter aerogenes           |    870.00|        1.00|    1.60|negative|
|     10|Brucella antracis              |      0.00|        0.01|    0.01|positive|
+-------+-------------------------------+----------+------------+--------+--------+

2 Writing CSV

Writing a CSV file is simple. As with every writer method, you supply the configuration using a consumer (an arrow function, =>).

<script>
    var rs = Ax.db.executeQuery("SELECT * FROM systables");
    
    var blob = new Ax.sql.Blob("systables.csv");
    new Ax.rs.Writer(rs).csv(options => {
        //options.setFile("/tmp/systables_file.csv");
        
        options.setResource(blob);

        // Add a header so Excel recognises the file as CSV
        options.setHeaderText("sep=" + options.getDelimiter());

        // Wire logger to console logger to see writer debug
        options.setLogger(console.getLogger()); 
    });
    return blob;
</script>

2.1 Describe CSVExportOptions

The options object is an instance of the CSVExportOptions configurator. You can inspect its properties by requesting its description as an object related to the main Writer class.

From the options below we can see that we can set the charset, the CSV delimiter, the quote character and so on.

<script>
    console.log(Class.describe(Ax.rs.Writer, "CSVExportOptions"));
</script>
class CSVExportOptions {
                                 -- constructors
                                (*) CSVExportOptions()

                                 -- methods class
                             String getHeaderText();
                               void setHeaderText(String);
                               void setOutputStream(OutputStream);
                               void setResource(OutputStream);

...
}

2.2 Formatting decimal output

There are multiple methods to set up the output format of decimal numbers in generated CSV files.

setNumberFormat defines the output format of a decimal column in a simple way. The method returns a Java DecimalFormat object, but usually no further action is required, as you already specify the format pattern and locale as parameters.

A more powerful (and more complex) alternative is setDecimalFormat, which also exposes a DecimalFormat object that you can configure completely.

The second parameter of setNumberFormat and setDecimalFormat is the pattern to use for output. Many characters in a pattern are taken literally; they are matched during parsing and output unchanged during formatting. Special characters, on the other hand, stand for other characters, strings, or classes of characters. They must be quoted, unless noted otherwise, if they are to appear in the prefix or suffix as literals.

The characters listed here are used in non-localized patterns. Localized patterns use the corresponding characters taken from this formatter's DecimalFormatSymbols object instead, and these characters lose their special status. Two exceptions are the currency sign and quote, which are not localized.

Symbol  Location             Meaning
0       Number               Digit
#       Number               Digit, zero shows as absent
.       Number               Decimal separator or monetary decimal separator
-       Number               Minus sign
,       Number               Grouping separator
E       Number               Separates mantissa and exponent in scientific notation. Need not be quoted in prefix or suffix.
;       Subpattern boundary  Separates positive and negative subpatterns
%       Prefix or suffix     Multiply by 100 and show as percentage
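As an illustration of how these symbols combine, here is a toy plain-JavaScript re-implementation of just the pattern "#,##0.00;(#,##0.00)" used in the example below. This is illustrative only; the real formatting is done by Java's DecimalFormat class.

```javascript
// Toy version of the pattern "#,##0.00;(#,##0.00)": grouping every 3
// digits, exactly 2 fraction digits, and the negative subpattern after
// ";" wraps the number in parentheses.
function formatPattern(n) {
    const neg = n < 0;
    const [intPart, fracPart] = Math.abs(n).toFixed(2).split(".");
    const grouped = intPart.replace(/\B(?=(\d{3})+(?!\d))/g, ",");
    const s = grouped + "." + fracPart;
    return neg ? "(" + s + ")" : s;
}

console.log(formatPattern(1234567.5)); // 1,234,567.50
console.log(formatPattern(-1234.5));   // (1,234.50)
```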
<script>
    var blob = new Ax.sql.Blob("systables.csv");
    
    var rs = Ax.db.executeQuery(`
        SELECT tabname, tabid, nrows,
               1234567890.123456789 colname_double, 1234567890.123456789 col_dec1, 1234567890.123456789 col_dec2
          FROM systables
         WHERE tabid < 10`);
    
    console.log("Writing CSV of systables");
    
    new Ax.rs.Writer(rs).csv(options => {
        options.setDelimiter(";");
        options.setLogger(console.getLogger()); 
        options.setResource(blob);
        options.setHeaderText("sep=" + options.getDelimiter());
    
    
        // ===============================================================
        // The return value of setNumberFormat is a DecimalFormat
        // ===============================================================
    
        var nf = options.getFormats().setNumberFormat("colname_double", "##,##", "es");
    
        // ===============================================================
        // The return value of setDecimalFormat is a DecimalFormat.
        // Notice that setDecimalFormat will fail if applied to a
        // non-decimal column.
        //
        // Cannot set decimal format for integer columns like nrows
        // java.lang.Double cannot be cast to class java.math.BigDecimal
        // ===============================================================
    
        for (var colname of ['col_dec1', 'col_dec2']) {
            var df = options.getFormats().setDecimalFormat(colname, "#,##0.00;(#,##0.00)");
            df.setMinimumFractionDigits(4);
            df.setMaximumFractionDigits(7);
            df.setGroupingSize(0);
        
            //
            // Convert US format "1,234,567,890.12345679" to ES "1.234.567.890,12345679"
            // changing DecimalFormatSymbols and setting it back to DecimalFormat
            //
            var  dfs = df.getDecimalFormatSymbols();
            dfs.setDecimalSeparator(',');
            dfs.setGroupingSeparator('.');
            
            df.setDecimalFormatSymbols(dfs);
        }
    });
    
    console.log("Content type=" + blob.getContentType());
    console.log(blob.getText());
    
    return blob;
</script>
sep=;
tabname;tabid;nrows;n_double;n_dec
systables;1;1901.000000;1.234.567.890,123;1234567890,1234568
syscolumns;2;27875.000000;1.234.567.890,123;1234567890,1234568
sysindices;3;6432.000000;1.234.567.890,123;1234567890,1234568
systabauth;4;1948.000000;1.234.567.890,123;1234567890,1234568
syscolauth;5;125.000000;1.234.567.890,123;1234567890,1234568
sysviews;6;327.000000;1.234.567.890,123;1234567890,1234568
sysusers;7;96.000000;1.234.567.890,123;1234567890,1234568
sysdepend;8;379.000000;1.234.567.890,123;1234567890,1234568
syssynonyms;9;0.000000;1.234.567.890,123;1234567890,1234568
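The DecimalFormatSymbols change in the script above effectively swaps the US grouping and decimal separators for the Spanish ones. A plain-JavaScript sketch of that character swap (illustrative only, not part of the Ax API):

```javascript
// Swap US separators ("," grouping, "." decimal) for Spanish ones
// ("." grouping, "," decimal) in an already-formatted number string.
function usToEsNumber(usFormatted) {
    return usFormatted
        .replace(/,/g, "\u0000")  // protect grouping commas
        .replace(/\./g, ",")      // decimal point becomes comma
        .replace(/\u0000/g, "."); // grouping commas become dots
}

console.log(usToEsNumber("1,234,567,890.12345679")); // 1.234.567.890,12345679
```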

3 Writing Excel

We have already seen write operations chained after a previous Read function. Now let's see how easy it is to export any database query to CSV, Excel or SQL format by simply using the Ax.rs.Writer API.

The following example exports to Excel the results of a query on the systables database table. The output (Excel) is sent to an in-memory blob.

<script>
    var rs = Ax.db.executeQuery("SELECT * FROM systables");
    
    var blob = new Ax.sql.Blob("systables.xls");
    new Ax.rs.Writer(rs).excel(options => {
        options.setResource(blob); 
    });
    return blob;
</script>

4 Writing JSON

You can easily generate JSON from a ResultSet.


JSON without metadata

<script>
    var blob = new Ax.sql.Blob("systables.json");
    var rs = Ax.db.executeQuery("SELECT tabid, tabname, nrows, npused FROM systables WHERE tabid < 5");
    new Ax.rs.Writer(rs).json(options => {
        options.setLogger(console.getLogger()); 
        options.setPrettyPrint(true);
        options.setResource(blob);
    
    });
    rs.close();
    console.log(new Ax.lang.String(blob.getBytes()));
    return blob;
</script>
[
  {
    "tabid": 1,
    "tabname": "systables",
    "nrows": 1873.0,
    "npused": 16.0
  },
  {
    "tabid": 2,
    "tabname": "syscolumns",
    "nrows": 27377.0,
    "npused": 108.0
  },
  {
    "tabid": 3,
    "tabname": "sysindices",
    "nrows": 6356.0,
    "npused": 152.0
  },
  {
    "tabid": 4,
    "tabname": "systabauth",
    "nrows": 1918.0,
    "npused": 10.0
  }
]

JSON with metadata

<script>
    var blob = new Ax.sql.Blob("systables.json");
    var rs = Ax.db.executeQuery("SELECT tabid, tabname, nrows, npused FROM systables WHERE tabid < 5");
    new Ax.rs.Writer(rs).json(options => {
        options.setLogger(console.getLogger()); 
        options.setPrettyPrint(true);
        options.setResource(blob);
        options.setShowMetaData(true);
    });
    rs.close();
    console.log(new Ax.lang.String(blob.getBytes()));
    return blob;
</script>
{
    "cols": [
      {
        "columnName": "tabid",
        "columnLabel": "tabid",
        "columnClassName": "java.lang.Integer",
        "columnDisplaySize": 10,
        "columnTypeName": "serial",
        "columnType": 4,
        "scale": 0,
        "precision": 10,
        "isAutoIncrement": true,
        "isNullable": 0,
        "isWritable": true,
        "isCaseSensitive": false
      },
      {
        "columnName": "tabname",
        "columnLabel": "tabname",
        "columnClassName": "java.lang.String",
        "columnDisplaySize": 128,
        "columnTypeName": "varchar",
        "columnType": 12,
        "scale": 0,
        "precision": 128,
        "isAutoIncrement": false,
        "isNullable": 1,
        "isWritable": true,
        "isCaseSensitive": false
      },
      {
        "columnName": "nrows",
        "columnLabel": "nrows",
        "columnClassName": "java.lang.Double",
        "columnDisplaySize": 17,
        "columnTypeName": "float",
        "columnType": 8,
        "scale": 0,
        "precision": 15,
        "isAutoIncrement": false,
        "isNullable": 1,
        "isWritable": true,
        "isCaseSensitive": false
      },
      {
        "columnName": "npused",
        "columnLabel": "npused",
        "columnClassName": "java.lang.Double",
        "columnDisplaySize": 17,
        "columnTypeName": "float",
        "columnType": 8,
        "scale": 0,
        "precision": 15,
        "isAutoIncrement": false,
        "isNullable": 1,
        "isWritable": true,
        "isCaseSensitive": false
      }
    ],
    "rows": [
      [
        1,
        "systables",
        1873.0,
        16.0
      ],
      [
        2,
        "syscolumns",
        27377.0,
        108.0
      ],
      [
        3,
        "sysindices",
        6356.0,
        152.0
      ],
      [
        4,
        "systabauth",
        1918.0,
        10.0
      ]
    ]
}

5 Writing SQL schema

The following example reads a CSV, transforms it (filters rows and adds a calculated column) and creates an SQL text file with the CREATE TABLE statement and the INSERT statements to fill the table with the CSV data.

<script>
    var blob = new Ax.sql.Blob("cars.sql");
    new Ax.rs.Reader().csv(options => {
        options.setResource("https://bitbucket.org/deister/axional-docs-resources/raw/master/CSV/cars93.csv");
        options.setExcludeColumnIndexes(0);
    }).rows().select(row => {
        let weightKG = row.getDouble("Weight") * 0.453592;
        let horsepower = row.getDouble("Horsepower");
        return horsepower / weightKG > 0.1;
    }).cols().add("MPG(Highway/City)", Ax.sql.Types.DOUBLE, v => {
        let cityMpg = v.getDouble("MPG.city");
        let highwayMpg = v.getDouble("MPG.highway");
        return highwayMpg / cityMpg;
    }).writer().sql(options => {
        options.setResource(blob);
        options.setTableName("cars");
        options.setCreateTable(true);
        // You can specify the target database driver (default is INFORMIX)
        // options.setDriver(Ax.db.getDriver());
    }
    );
    return blob;
</script>
CREATE TABLE cars (
                     manufacturer char (13) not null,
                            model char (14) not null,
                             type char (7) not null,
                        min.price double precision not null,
                            price double precision not null,
                        max.price double precision not null,
                         mpg.city integer not null,
                      mpg.highway integer not null,
                          airbags char (18) not null,
                       drivetrain char (5) not null,
                        cylinders char (6) not null,
                       enginesize double precision not null,
                       horsepower integer not null,
                              rpm integer not null,
                     rev.per.mile integer not null,
                  man.trans.avail boolean not null,
               fuel.tank.capacity double precision not null,
                       passengers integer not null,
                           length integer not null,
                        wheelbase integer not null,
                            width integer not null,
                      turn.circle integer not null,
                   rear.seat.room double precision not null,
                     luggage.room integer not null,
                           weight integer not null,
                           origin char (7) not null,
                             make char (24) not null,
                mpg(highway/city) double precision not null
);
INSERT INTO cars (Manufacturer,Model,Type,Min.Price,Price,Max.Price,MPG.city,MPG.highway,AirBags,DriveTrain,Cylinders,EngineSize,Horsepower,RPM,Rev.per.mile,Man.trans.avail,Fuel.tank.capacity,Passengers,Length,Wheelbase,Width,Turn.circle,Rear.seat.room,Luggage.room,Weight,Origin,Make,mpg(highway/city)) VALUES ('Acura','Integra','Small',12.9,15.9,18.8,25,31,'None','Front','4',1.8,140,6300,2890,true,13.2,5,177,102,68,37,26.5,11,2705,'non-USA','Acura Integra',1.24);
INSERT INTO cars (Manufacturer,Model,Type,Min.Price,Price,Max.Price,MPG.city,MPG.highway,AirBags,DriveTrain,Cylinders,EngineSize,Horsepower,RPM,Rev.per.mile,Man.trans.avail,Fuel.tank.capacity,Passengers,Length,Wheelbase,Width,Turn.circle,Rear.seat.room,Luggage.room,Weight,Origin,Make,mpg(highway/city)) VALUES ('Acura','Legend','Midsize',29.2,33.9,38.7,18,25,'Driver & Passenger','Front','6',3.2,200,5500,2335,true,18.0,5,195,115,71,38,30.0,15,3560,'non-USA','Acura Legend',1.3888888888888888);
INSERT INTO cars (Manufacturer,Model,Type,Min.Price,Price,Max.Price,MPG.city,MPG.highway,AirBags,DriveTrain,Cylinders,EngineSize,Horsepower,RPM,Rev.per.mile,Man.trans.avail,Fuel.tank.capacity,Passengers,Length,Wheelbase,Width,Turn.circle,Rear.seat.room,Luggage.room,Weight,Origin,Make,mpg(highway/city)) VALUES ('Audi','90','Compact',25.9,29.1,32.3,20,26,'Driver only','Front','6',2.8,172,5500,2280,true,16.9,5,180,102,67,37,28.0,14,3375,'non-USA','Audi 90',1.3);
INSERT INTO cars (Manufacturer,Model,Type,Min.Price,Price,Max.Price,MPG.city,MPG.highway,AirBags,DriveTrain,Cylinders,EngineSize,Horsepower,RPM,Rev.per.mile,Man.trans.avail,Fuel.tank.capacity,Passengers,Length,Wheelbase,Width,Turn.circle,Rear.seat.room,Luggage.room,Weight,Origin,Make,mpg(highway/city)) VALUES ('Audi','100','Midsize',30.8,37.7,44.6,19,26,'Driver & Passenger','Front','6',2.8,172,5500,2535,true,21.1,6,193,106,70,37,31.0,17,3405,'non-USA','Audi 100',1.368421052631579);
INSERT INTO cars (Manufacturer,Model,Type,Min.Price,Price,Max.Price,MPG.city,MPG.highway,AirBags,DriveTrain,Cylinders,EngineSize,Horsepower,RPM,Rev.per.mile,Man.trans.avail,Fuel.tank.capacity,Passengers,Length,Wheelbase,Width,Turn.circle,Rear.seat.room,Luggage.room,Weight,Origin,Make,mpg(highway/city)) VALUES ('BMW','535i','Midsize',23.7,30.0,36.2,22,30,'Driver only','Rear','4',3.5,208,5700,2545,true,21.1,4,186,109,69,39,27.0,13,3640,'non-USA','BMW 535i',1.3636363636363635);
INSERT INTO cars (Manufacturer,Model,Type,Min.Price,Price,Max.Price,MPG.city,MPG.highway,AirBags,DriveTrain,Cylinders,EngineSize,Horsepower,RPM,Rev.per.mile,Man.trans.avail,Fuel.tank.capacity,Passengers,Length,Wheelbase,Width,Turn.circle,Rear.seat.room,Luggage.room,Weight,Origin,Make,mpg(highway/city)) VALUES ('Buick','LeSabre','Large',19.9,20.8,21.7,19,28,'Driver only','Front','6',3.8,170,4800,1570,false,18.0,6,200,111,74,42,30.5,17,3470,'USA','Buick LeSabre',1.4736842105263157);
INSERT INTO cars (Manufacturer,Model,Type,Min.Price,Price,Max.Price,MPG.city,MPG.highway,AirBags,DriveTrain,Cylinders,EngineSize,Horsepower,RPM,Rev.per.mile,Man.trans.avail,Fuel.tank.capacity,Passengers,Length,Wheelbase,Width,Turn.circle,Rear.seat.room,Luggage.room,Weight,Origin,Make,mpg(highway/city)) VALUES ('Buick','Riviera','Midsize',26.3,26.3,26.3,19,27,'Driver only','Front','6',3.8,170,4800,1690,false,18.8,5,198,108,73,41,26.5,14,3495,'USA','Buick Riviera',1.4210526315789473);
INSERT INTO cars (Manufacturer,Model,Type,Min.Price,Price,Max.Price,MPG.city,MPG.highway,AirBags,DriveTrain,Cylinders,EngineSize,Horsepower,RPM,Rev.per.mile,Man.trans.avail,Fuel.tank.capacity,Passengers,Length,Wheelbase,Width,Turn.circle,Rear.seat.room,Luggage.room,Weight,Origin,Make,mpg(highway/city)) VALUES ('Cadillac','DeVille','Large',33.0,34.7,36.3,16,25,'Driver 
...

You may need to set the target database driver to generate the appropriate database schema. It is also needed to set up the values for DATE and DATETIME column types in the insert statements.

6 Writing SQL inserts

This feature obtains a text with the SQL INSERT statements for the data of a table, allowing you to regenerate the content in another table with the same schema.

The result of the executed query is stored in a blob using Ax.rs.Writer(ResultSet).sql, indicating the corresponding table name:

var dbsrc = Ax.db.of("db_source");
var rs    = dbsrc.executeQuery("SELECT FIRST 4 * FROM sysusers");

var blob = new Ax.sql.Blob("tmp");
new Ax.rs.Writer(rs).sql(options => {
    options.setTableName("sysusers");
    options.setResource(blob);
});
rs.close();
var text = blob.getText();
console.log(text);
INSERT INTO sysusers (username,usertype,priority,password,defrole) VALUES ('informix','D',9,'','');
INSERT INTO sysusers (username,usertype,priority,password,defrole) VALUES ('public','C',5,'','');
INSERT INTO sysusers (username,usertype,priority,password,defrole) VALUES ('deister1','C',5,'','');
INSERT INTO sysusers (username,usertype,priority,password,defrole) VALUES ('deister2','R',5,'','');

The following sample saves into the catalog table wic_table_object_data the insert statements used to load data into the new tables created from the schema model when a database is initialized. First, it selects the tables marked as 'MASTER' and executes a query on the functional database to obtain the data. Then it converts this data into SQL INSERT statements. Finally, it inserts the result into wic_table_object_data (the tab_data field).

<script>
    var dict = Ax.db.of("wic_new");
    console.log("Source is:" + Ax.db);
    console.log("Target is:" + dict);
    
    var tables = dict.executeQuery("SELECT tab_name FROM wic_table_object WHERE tab_label = 'MASTER'");
    for (var row of tables) {
        var tableName = row.get("tab_name");
        var rs = Ax.db.executeQuery("SELECT * FROM " + tableName);
    
        var blob = new Ax.sql.Blob("tmp");
        new Ax.rs.Writer(rs).sql(options => {
            options.setTableName(tableName);
            options.setResource(blob);
        });
        rs.close();
       
        var load = blob.getText();
    
        var data =  {
            "tab_name" : tableName,
            "tab_delimiter" : 'S',
            "tab_data" : load,
        };
        dict.execute("DELETE FROM wic_table_object_data WHERE tab_name = '" + tableName + "'");
        dict.insert("wic_table_object_data", data);
    }
</script>

7 Writing to database

You can directly transfer data to database tables. To do that, you simply need to specify the table name. The write function will automatically map the specified column names to table columns and perform the insert operations.

You can control table creation if needed. By default, the writer will not create a table unless you tell it to. The method setCreate(boolean create, boolean isTemp) controls whether the table should be created before the insert and whether it should be temporary.

7.1 Describe DBExportOptions

The options object is an instance of the DBExportOptions configurator. You can inspect its properties by requesting its description as an object related to the main Writer class.

<script>
        console.log(Class.describe(Ax.rs.Writer, "DBExportOptions"));
</script>

7.2 Batch size

You can set the batch size for insert operations using options.setBatchSize(n).

Batch size only applies to insert operations. In insert-only mode (no PK present), inserts are sent to the server as a batch operation.

In update-insert mode (PK present), an update of the row is executed first; if the record is not found, the data is inserted. This last insert operation is executed in batch mode.

To disable batch insert operations, set the batch size to 0.
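The update-insert flow can be sketched with a toy model in plain JavaScript (a Map stands in for the table; the mergeRows helper is illustrative and not part of the Ax API):

```javascript
// Toy model of update-insert mode: an UPDATE by primary key is tried
// first; rows whose key is not present are queued and inserted in
// batches of batchSize.
function mergeRows(table, pkCol, rows, batchSize) {
    const stats = { inserted: 0, updated: 0 };
    let batch = [];
    const flush = () => {
        for (const r of batch) { table.set(r[pkCol], r); stats.inserted++; }
        batch = [];
    };
    for (const row of rows) {
        if (table.has(row[pkCol])) {        // PK match: update in place
            table.set(row[pkCol], row);
            stats.updated++;
        } else {                            // no match: queue batched insert
            batch.push(row);
            if (batch.length === batchSize) flush();
        }
    }
    flush();                                // flush the final partial batch
    return stats;
}

const table = new Map([["Proteus vulgaris", { bacteria: "Proteus vulgaris", gram: "" }]]);
const stats = mergeRows(table, "bacteria", [
    { bacteria: "Proteus vulgaris", gram: "negative" },  // existing key: updated
    { bacteria: "Escherichia coli", gram: "negative" }   // new key: inserted
], 100);
console.log(stats); // { inserted: 1, updated: 1 }
```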

new Ax.rs.Reader().csv(options => {
        // Wire console logger to see debug information from reader in console
        options.setLogger(console.getLogger());
        // log progress every 10 rows
        options.setLogSize(10);
        options.setDelimiter(";");
        options.setHeader(false);
        options.setFile("/tmp/data1.csv");
        options.setMemoryMapped(true);
        options.setColumnNameMapping((colName, colOrdinal) => {
            switch (colOrdinal) {
                case 0:     return "code";
                case 1:     return "name";
                default:    return colName;
            }
        });
    }).writer().db(options => {
        options.setLogger(console.getLogger());
        options.setConnection(Ax.db.getObject());
        options.setBatchSize(5000);
        options.setTableName("vsc_test");
        options.setInsertFirst(true);
    }
    );

options.setInsertFirst(true) tells the writer to try a direct insert row by row; if a primary key error is received, the row is updated instead (if the batch size is 0 the update is direct, otherwise it is batched).

7.3 Commit every

You can make the writer commit automatically every given number of rows processed, to avoid long transactions, by using options.setCommitSize(n).

If you have set a batch size, it is recommended to set a commit size that is a multiple of the batch size: committing the transaction before the batched rows have been flushed to the server persists nothing and is a waste of time.
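A small simulation makes the point. This is an assumption-laden sketch of the rule above, not the actual writer internals: it counts commits that fire before the current batch has been flushed, which persist nothing for the pending rows.

```javascript
// Count commits that fire while rows are still pending in an unflushed
// batch: such commits are wasted work for those rows.
function wastedCommits(totalRows, batchSize, commitSize) {
    let pending = 0, wasted = 0;
    for (let row = 1; row <= totalRows; row++) {
        pending++;
        if (pending === batchSize) pending = 0;              // batch flushed to server
        if (row % commitSize === 0 && pending > 0) wasted++; // commit before flush
    }
    return wasted;
}

console.log(wastedCommits(1000, 100, 250)); // 2  (commits at rows 250 and 750 find 50 unflushed rows)
console.log(wastedCommits(1000, 100, 200)); // 0  (commit size is a multiple of batch size)
```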

7.4 Insert

In the following example we will see how to load a data set and use its column information to automatically create a new table. We have three data sets:

  • antibiotics the complete information about antibiotics.
  • antibiotics1 partial information about antibiotics (antibiotics 1 to 10 with missing gram for 1 and 2)
  • antibiotics2 partial information about antibiotics that updates and completes previous data set. (antibiotics 11 to 16 and fixed values for 1 and 2)

Let's see how to insert the antibiotics1 data set into a new table named antibiotics.

<script>
    // Ensure the table does not exist for our test
    Ax.db.execute("DROP TABLE IF EXISTS antibiotics");
   
    var rs1 = new Ax.rs.Reader().csv(options => {
        options.setResource("https://bitbucket.org/deister/axional-docs-resources/raw/master/CSV/antibiotics1.csv");
    }).writer().db(options => {
        // wire log to console
        options.setLogger(console.getLogger()); 
        // apply operations on current database
        options.setConnection(Ax.db.getObject());
        // apply operations on table "antibiotics"
        options.setTableName("antibiotics");
        // create table
        options.setTableCreate(true);
        // create primary key named "pk_antibiotics" on column bacteria
        options.setTablePrimaryKeyName("pk_antibiotics");
        options.setTablePrimaryKeyColumns("bacteria");
    });
    console.log(rs1);
</script>

Log

...

DBExportModelDB [      36]: INSERT INTO antibiotics (Id,Bacteria,Penicillin,Streptomycin,Neomycin,Gram) VALUES (?,?,?,?,?,?)
DBExportModelDB [      36]: Table antibiotics will be created, temporary=false
DBExportModelDB [      42]: CREATE TABLE antibiotics (
                               id integer not null,
                         bacteria char (32) not null,
                       penicillin float not null,
                     streptomycin float not null,
                         neomycin float not null,
                             gram char (8),
 PRIMARY KEY (bacteria) CONSTRAINT pk_antibiotics
)
DBExportModelDB [      55]:  10 row(s) inserted, 0 row(s) updated, 0 row(s) excluded, 0 error(s)
+--+--------------------------------+----------+------------+--------+--------+
|Id|Bacteria                        |Penicillin|Streptomycin|Neomycin|Gram    |
+--+--------------------------------+----------+------------+--------+--------+
| 1| Mycobacterium tuberculosis     |    0.0000|      5.0000|  2.0000|        |
| 2| Salmonella schottmuelleri      |    0.0000|      0.8000|  0.0900|        |
| 3| Proteus vulgaris               |    3.0000|      0.1000|  0.1000|negative|
| 4| Klebsiella pneumoniae          |  850.0000|      1.2000|  1.0000|negative|
| 5| Brucella abortus               |    1.0000|      2.0000|  0.0200|negative|
| 6| Pseudomonas aeruginosa         |  850.0000|      2.0000|  0.4000|negative|
| 7| Escherichia coli               |  100.0000|      0.4000|  0.1000|negative|
| 8| Salmonella (Eberthella) typhosa|    1.0000|      0.4000|  0.0080|negative|
| 9| Aerobacter aerogenes           |  870.0000|      1.0000|  1.6000|negative|
|10| Brucella antracis              |    0.0010|      0.0100|  0.0070|positive|
+--+--------------------------------+----------+------------+--------+--------+

Notice from the logs that the table has been created with the primary key on the specified column. Each column also has the appropriate type and size for its data, and nulls are admitted only on columns that contain nulls.

7.4.1 Controlling data type

As seen in the previous example, the table data types and the not null attributes are set according to the input metadata.

The data type of each column comes from the input according to:

  • If the source is a database result set, the type is taken from the database metadata.
  • If the source is a CSV or Excel load, it will be the optimum data type inferred from the input values: numeric (an integer or long if there is no floating point, a double if there is), date, time, datetime or string.

The value of not null comes from the input according to:

  • If the source is a database result set, not null is taken from the database metadata.
  • If the source is a CSV or Excel load, the column will be not null if and only if all values in the column are not null.
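These inference rules can be sketched as follows (a toy illustration in plain JavaScript; the actual Ax implementation is not shown in this document):

```javascript
// Infer a column type and its not-null flag from raw CSV string values:
// INTEGER if every value is a whole number, DOUBLE if all are numeric
// but some carry a fractional part, otherwise CHAR; NOT NULL only when
// no value is missing.
function inferColumn(values) {
    const notNull = values.every(v => v !== "" && v != null);
    const present = values.filter(v => v !== "" && v != null);
    const numbers = present.map(Number);
    const allNumeric = present.length > 0 && numbers.every(n => !Number.isNaN(n));
    let type = "CHAR";
    if (allNumeric) {
        type = numbers.every(Number.isInteger) ? "INTEGER" : "DOUBLE";
    }
    return { type: type, notNull: notNull };
}

console.log(inferColumn(["1", "2", "3"]));  // { type: 'INTEGER', notNull: true }
console.log(inferColumn(["0.5", "2", ""])); // { type: 'DOUBLE', notNull: false }
console.log(inferColumn(["a", "b"]));       // { type: 'CHAR', notNull: true }
```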

Using the options configurator you can change both the type and the not null attribute of a column. For example, if we want Neomycin to admit nulls and have type DECIMAL(12,4), we can do:

<script>

    ...
    options.setColumnType("Neomycin", Ax.sql.Types.DECIMAL, 12, 4);
    options.setColumnIsNotNull("Neomycin", false);
</script>
DBExportModelDB [      21]: CREATE TEMP TABLE antibiotics (
                               id integer not null,
                         bacteria char (31) not null,
                       penicillin float not null,
                     streptomycin float not null,
                         neomycin decimal (12,4),
                             gram char (8)
) WITH NO LOG

And the table types will be set up accordingly.

Using the option setTableSource(tablename), you can create the table for loading data as a copy of an existing database table. For example, imagine that antibiotics is a physical table and we want to load data into a temp table with the same structure as antibiotics:

<script>

    Ax.db.execute(`
        CREATE TABLE antibiotics(Id SMALLINT, 
                                 Bacteria VARCHAR(30), 
                                 Penicillin DECIMAL(6,3),
                                 Streptomycin DECIMAL(6,3),
                                 Neomycin DECIMAL(6,3),
                                 Gram CHAR(10))
        `);
    ...
    
    // Apply operations on table "tmp_antibiotics"
    options.setTableName("tmp_antibiotics");

    // Create table as temp table
    options.setTableCreateTemp(true);

    // Create temp table as antibiotics table.
    options.setTableSource("antibiotics");

</script>
DBExportModelDB [      21]: CREATE TEMP TABLE tmp_antibiotics (
                               id smallint not null,
                         bacteria varchar (30) not null,
                       penicillin decimal (6,3) not null,
                     streptomycin decimal (6,3) not null,
                         neomycin decimal (6,3) not null,
                             gram char (10),
 PRIMARY KEY (bacteria)
) WITH NO LOG

7.5 Update or Insert (merge)

Now we can use the antibiotics2 data set to do a merge on table antibiotics. Rows that match the primary key will be updated, while rows not present will be inserted.

If no physical primary key exists on the destination table, you can simulate one by invoking: options.setTablePrimaryKeyColumns("pkcols");

If neither a physical primary key nor a virtual PK is defined, the db writer performs an insert-only operation.

Notice that the previous operation created a PRIMARY KEY on the table, as a PRIMARY KEY is required to update a table.
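The merge behaviour can be sketched with a small stand-alone function (partitionMerge is a hypothetical name illustrating the logic, not the Ax implementation): incoming rows whose primary key already exists become updates, the rest become inserts.

```javascript
// Sketch of merge (upsert) partitioning: rows matching an existing
// primary key value are updated, the others are inserted.
function partitionMerge(existingKeys, incomingRows, pkColumn) {
    const updates = [], inserts = [];
    for (const row of incomingRows) {
        (existingKeys.has(row[pkColumn]) ? updates : inserts).push(row);
    }
    return { updates, inserts };
}

const existing = new Set(["Proteus vulgaris", "Brucella abortus"]);
const incoming = [
    { bacteria: "Proteus vulgaris", penicillin: 3 },   // PK match → update
    { bacteria: "Escherichia coli", penicillin: 100 }  // new PK   → insert
];
const { updates, inserts } = partitionMerge(existing, incoming, "bacteria");
console.log(updates.length, inserts.length); // → 1 1
```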
Copy
<script>
    var rs2 = new Ax.rs.Reader().csv(options => {
        options.setResource("https://bitbucket.org/deister/axional-docs-resources/raw/master/CSV/antibiotics2.csv");
    }).writer().db(options => {
        options.setLogger(console.getLogger());
        options.setConnection(Ax.db.getObject());
        options.setTableName("antibiotics");
    });
    console.log(rs2);
</script>

Log

...
DBExportModelDB [      13]: INSERT INTO antibiotics (Id,Bacteria,Penicillin,Streptomycin,Neomycin,Gram) VALUES (?,?,?,?,?,?)
DBExportModelDB [      13]: Table antibiotics will updated first
DBExportModelDB [     118]: Primary key columns obtained from metadata: [bacteria]
DBExportModelDB [     119]: UPDATE antibiotics SET Id = ?,Penicillin = ?,Streptomycin = ?,Neomycin = ?,Gram = ? WHERE bacteria = ?
DBExportModelDB [     146]:  6 row(s) inserted, 2 row(s) updated, 0 row(s) excluded
+--+--------------------------------+----------+------------+--------+--------+
|Id|Bacteria                        |Penicillin|Streptomycin|Neomycin|Gram    |
+--+--------------------------------+----------+------------+--------+--------+
| 1| Mycobacterium tuberculosis     |    0.0000|      5.0000|  2.0000|        |
| 2| Salmonella schottmuelleri      |    0.0000|      0.8000|  0.0900|        |
| 3| Proteus vulgaris               |    3.0000|      0.1000|  0.1000|negative|
| 4| Klebsiella pneumoniae          |  850.0000|      1.2000|  1.0000|negative|
| 5| Brucella abortus               |    1.0000|      2.0000|  0.0200|negative|
| 6| Pseudomonas aeruginosa         |  850.0000|      2.0000|  0.4000|negative|
| 7| Escherichia coli               |  100.0000|      0.4000|  0.1000|negative|
| 8| Salmonella (Eberthella) typhosa|    1.0000|      0.4000|  0.0080|negative|
| 9| Aerobacter aerogenes           |  870.0000|      1.0000|  1.6000|negative|
|10| Brucella antracis              |    0.0010|      0.0100|  0.0070|positive|
+--+--------------------------------+----------+------------+--------+--------+

7.5.1 Comparing results

Now we can compare both data sets. The source data set with the master antibiotics data should match our database data loaded from the two files.

Copy
<script>
    var rs_gen = Ax.db.executeQuery("SELECT * FROM antibiotics").toMemory();
    var rs_src = new Ax.rs.Reader().csv(options => {
        options.setResource("https://bitbucket.org/deister/axional-docs-resources/raw/master/CSV/antibiotics.csv");
    });
    
    console.log(rs_src);
    console.log(rs_gen);
</script>
+--+--------------------------------+----------+------------+--------+--------+
|Id|Bacteria                        |Penicillin|Streptomycin|Neomycin|Gram    |
+--+--------------------------------+----------+------------+--------+--------+
| 1| Mycobacterium tuberculosis     |  800.0000|      5.0000|  2.0000|negative|
| 2| Salmonella schottmuelleri      |   10.0000|      0.8000|  0.0900|negative|
| 3| Proteus vulgaris               |    3.0000|      0.1000|  0.1000|negative|
| 4| Klebsiella pneumoniae          |  850.0000|      1.2000|  1.0000|negative|
| 5| Brucella abortus               |    1.0000|      2.0000|  0.0200|negative|
| 6| Pseudomonas aeruginosa         |  850.0000|      2.0000|  0.4000|negative|
| 7| Escherichia coli               |  100.0000|      0.4000|  0.1000|negative|
| 8| Salmonella (Eberthella) typhosa|    1.0000|      0.4000|  0.0080|negative|
| 9| Aerobacter aerogenes           |  870.0000|      1.0000|  1.6000|negative|
|10| Brucella antracis              |    0.0010|      0.0100|  0.0070|positive|
|11| Streptococcus fecalis          |    1.0000|      1.0000|  0.1000|positive|
|12| Staphylococcus aureus          |    0.0300|      0.0300|  0.0010|positive|
|13| Staphylococcus albus           |    0.0070|      0.1000|  0.0010|positive|
|14| Streptococcus hemolyticus      |    0.0010|     14.0000| 10.0000|positive|
|15| Streptococcus viridans         |    0.0050|     10.0000| 40.0000|positive|
|16| Diplococcus pneumoniae         |    0.0050|     11.0000| 10.0000|positive|
+--+--------------------------------+----------+------------+--------+--------+

+--+--------------------------------+----------+------------+--------+--------+
|id|bacteria                        |penicillin|streptomycin|neomycin|gram    |
+--+--------------------------------+----------+------------+--------+--------+
| 1| Mycobacterium tuberculosis     |  800.0000|      5.0000|  2.0000|negative|
| 2| Salmonella schottmuelleri      |   10.0000|      0.8000|  0.0900|negative|
| 3| Proteus vulgaris               |    3.0000|      0.1000|  0.1000|negative|
| 4| Klebsiella pneumoniae          |  850.0000|      1.2000|  1.0000|negative|
| 5| Brucella abortus               |    1.0000|      2.0000|  0.0200|negative|
| 6| Pseudomonas aeruginosa         |  850.0000|      2.0000|  0.4000|negative|
| 7| Escherichia coli               |  100.0000|      0.4000|  0.1000|negative|
| 8| Salmonella (Eberthella) typhosa|    1.0000|      0.4000|  0.0080|negative|
| 9| Aerobacter aerogenes           |  870.0000|      1.0000|  1.6000|negative|
|10| Brucella antracis              |    0.0010|      0.0100|  0.0070|positive|
|11| Streptococcus fecalis          |    1.0000|      1.0000|  0.1000|positive|
|12| Staphylococcus aureus          |    0.0300|      0.0300|  0.0010|positive|
|13| Staphylococcus albus           |    0.0070|      0.1000|  0.0010|positive|
|14| Streptococcus hemolyticus      |    0.0010|     14.0000| 10.0000|positive|
|15| Streptococcus viridans         |    0.0050|     10.0000| 40.0000|positive|
|16| Diplococcus pneumoniae         |    0.0050|     11.0000| 10.0000|positive|
+--+--------------------------------+----------+------------+--------+--------+

7.6 Handling errors

We can control error handling during database insert or update operations to decide whether to abort the operation or continue while keeping a log. For this example we will use two files:

  • particles, which contains the elementary physical particles except W Boson+ and W Boson-
  • particles-extra, which contains the 2 particles missing from the previous file (W Boson) plus the Tau particle with null values.

7.6.1 Initial loading

We will load a physics elementary particles CSV file, creating a table and setting a primary key on the particle name. As the loader will determine that no column contains nulls, all columns will be created as not null.

Copy
<script>
    var rs1 = new Ax.rs.Reader().csv(options => {
        options.setResource("https://bitbucket.org/deister/axional-docs-resources/raw/master/CSV/particles.csv");
    }).writer().db(options => {
        options.setLogger(console.getLogger());
        options.setConnection(Ax.db.getObject());
        options.setTableName("particles");
        options.setTableCreate(true);
        options.setTablePrimaryKeyName("pk_particles");
        options.setTablePrimaryKeyColumns("name");
    });
    console.log(rs1);
</script>

Log

...
DBExportModelDB [       8]: INSERT INTO particles (Id,Name,Type,Spin,Charge,Mass__MeV_c_2_) VALUES (?,?,?,?,?,?)
DBExportModelDB [       8]: Table particles will be created, temporary=false
DBExportModelDB [       9]: CREATE TABLE particles (
                               id integer not null,
                             name char (17) not null,
                             type char (11) not null,
                             spin float not null,
                           charge float not null,
                   mass__mev_c_2_ float not null,
 PRIMARY KEY (name) CONSTRAINT pk_particles
)
DBExportModelDB [      20]:  16 row(s) inserted, 0 row(s) updated, 0 row(s) excluded, 0 error(s)
+--+-----------------+-----------+------+-------+-----------+
|Id|Name             |Type       |Spin  |Charge |Mass (MeV/c|
|  |                 |           |      |       |^2)        |
+--+-----------------+-----------+------+-------+-----------+
| 1|Up               |Quark      |0.5000| 0.6667|     2.4000|
| 2|Charm            |Quark      |0.5000| 0.6667|  1270.0000|
| 3|Top              |Quark      |0.5000| 0.6667|171200.0000|
| 4|Photon           |Gauge Boson|0.0000| 0.0000|     0.0000|
| 5|Higgs Boson      |Higgs Boson|0.0000| 0.0000|127000.0000|
| 6|Down             |Quark      |0.5000|-0.3333|     4.8000|
| 7|Strange          |Quark      |0.5000|-0.3333|   104.0000|
| 8|Bottom           |Quark      |0.5000|-0.3333|  4200.0000|
| 9|Gluon            |Gauge Boson|1.0000| 0.0000|     0.0000|
|10|Electron Neutrino|Lepton     |0.5000| 0.0000|     0.0022|
|11|Muon Neutron     |Lepton     |0.5000| 0.0000|     0.0002|
|12|Tau Neutrino     |Lepton     |0.5000| 0.0000|    15.5000|
|13|Z Boson          |Gauge Boson|1.0000| 0.0000| 91200.0000|
|14|Electron         |Lepton     |0.5000|-1.0000|     0.5110|
|15|Muon             |Lepton     |0.5000|-1.0000|   105.7000|
|16|Tau              |Lepton     |0.5000|-1.0000|  1777.0000|
+--+-----------------+-----------+------+-------+-----------+

7.6.2 Additional loading

Now we can try to add (merge) the particles CSV file that contains some missing values. As the columns in the particles table were created as not null, any missing value makes the load of that row fail and, consequently, the whole loading process fails.

Copy
<script>
    new Ax.rs.Reader().csv(options => {
        options.setResource("https://bitbucket.org/deister/axional-docs-resources/raw/master/CSV/particles-extra.csv");
    }).writer().db(options => {
        options.setLogger(console.getLogger());
        options.setConnection(Ax.db.getObject());
        options.setTableName("particles");
    });
</script>
java.lang.RuntimeException: java.sql.SQLException: Cannot insert a null into column (particles.mass__mev_c_2_).

We can check that no rows have been inserted or updated.

Copy
<script>
    console.log(Ax.db.executeQuery("SELECT id, name from particles"));
</script>
+----------+-----------------+
|id        |name             |
+----------+-----------------+
|         1|Up               |
|         2|Charm            |
|         3|Top              |
|         4|Photon           |
|         5|Higgs Boson      |
|         6|Down             |
|         7|Strange          |
|         8|Bottom           |
|         9|Gluon            |
|        10|Electron Neutrino|
|        11|Muon Neutron     |
|        12|Tau Neutrino     |
|        13|Z Boson          |
|        14|Electron         |
|        15|Muon             |
|        16|Tau              |
+----------+-----------------+

7.6.3 Error handler

Let's now add an error handler. The error handler is a function that the loader calls on every error. The loader passes it an error object with the error information and expects a boolean return value that decides continuation.

  • If true, the error is ignored and the load continues.
  • If false, the load process stops and the whole transaction is rolled back.
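The contract above can be sketched as a stand-alone loop (loadRows and insertNotNull are hypothetical names illustrating the mechanism, not the Ax implementation):

```javascript
// Sketch of the error-handler contract: the loader calls the handler on
// each failed row; `true` skips the row, `false` aborts the whole load.
function loadRows(rows, insertFn, errorHandler) {
    let inserted = 0, errors = 0;
    for (const row of rows) {
        try {
            insertFn(row);
            inserted++;
        } catch (e) {
            errors++;
            if (!errorHandler({ row, message: e.message })) {
                throw new Error("load aborted, transaction rolled back");
            }
        }
    }
    return { inserted, errors };
}

// An insert that rejects null masses, plus a handler that keeps going.
const insertNotNull = row => {
    if (row.mass === null) throw new Error("cannot insert a null into mass");
};
const stats = loadRows(
    [{ name: "Gluon", mass: 0 }, { name: "Tau", mass: null }],
    insertNotNull,
    error => true  // continue ignoring the error
);
console.log(stats); // → { inserted: 1, errors: 1 }
```

Returning false from the handler would instead raise, leaving no rows committed, which matches the all-or-nothing behaviour seen in the previous section.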

Copy
<script>
    new Ax.rs.Reader().csv(options => {
        options.setResource("https://bitbucket.org/deister/axional-docs-resources/raw/master/CSV/particles-extra.csv");
    }).writer().db(options => {
        options.setLogger(console.getLogger());
        options.setConnection(Ax.db.getObject());
        options.setTableName("particles");
        options.setErrorHandler(error => {
            console.log("Row     : " + error.getRow());
            console.log("Type    : " + error.getType());
            console.log("Data    : " + error.getData());
            console.log("Error   : " + error.getErrorCode());
            console.log("SQLState: " + error.getSQLCode());
            console.log("Message : " + error.getMessage());
            // Continue ignoring the error
            return true;
        });
    });
</script>
Row     : 0
Type    : UPDATE
Data    : {Type=Lepton, Spin=0.5, Mass (MeV/c^2)=null, Charge=-1, Id=16, Name=Tau}
Error   : -391
SQLState: 23000
Message : Cannot insert a null into column (particles.mass__mev_c_2_).
Row     : 0
Type    : INSERT
Data    : {Type=Lepton, Spin=0.5, Mass (MeV/c^2)=null, Charge=-1, Id=16, Name=Tau}
Error   : -391
SQLState: 23000
Message : Cannot insert a null into column (particles.mass__mev_c_2_).
DBExportModelDB [      25]:  2 row(s) inserted, 0 row(s) updated, 0 row(s) excluded, 1 error(s)

The transaction has ignored the failing row (the Tau particle), which failed in both the UPDATE and the INSERT, but it has inserted the two valid rows.

7.7 Loading cars example

The following example reads a CSV, transforms it (removes the first column, filters rows, and adds a calculated column) and inserts the data into a newly created table.

Copy
<script>
    Ax.db.execute("DROP TABLE IF EXISTS cars");
    new Ax.rs.Reader().csv(options => {
        options.setResource("https://bitbucket.org/deister/axional-docs-resources/raw/master/CSV/cars93.csv");
        options.setExcludeColumnIndexes(0);
    }).rows().select(row => {
        let weightKG = row.getDouble("Weight") * 0.453592;
        let horsepower = row.getDouble("Horsepower");
        return horsepower / weightKG > 0.1;
    }).cols().add("MPG(Highway/City)", Ax.sql.Types.DOUBLE, v => {
        let cityMpg = v.getDouble("MPG.city");
        let highwayMpg = v.getDouble("MPG.highway");
        return highwayMpg / cityMpg;
    }).writer().db(options => {
        options.setConnection(Ax.db.getObject());
        options.setTableName("cars");
        // Create the table as a temporary table
        options.setTableCreateTemp(true);
    });
    return Ax.db.executeQuery("SELECT * FROM cars");
</script>
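The two transformations in this example (the power-to-weight filter and the calculated MPG ratio column) can be sketched as a stand-alone function. The sample rows below use illustrative values in the shape of the cars93 data; transformCars is a hypothetical helper, not the Ax API:

```javascript
// Sketch of the row filter and calculated column from the example above:
// keep cars with horsepower / weight(kg) > 0.1, then add the ratio column.
const KG_PER_LB = 0.453592;

function transformCars(rows) {
    return rows
        .filter(r => r.Horsepower / (r.Weight * KG_PER_LB) > 0.1)
        .map(r => ({ ...r, "MPG(Highway/City)": r["MPG.highway"] / r["MPG.city"] }));
}

// Illustrative rows: the first passes the filter, the second does not.
const cars = [
    { Make: "Acura Integra", Horsepower: 140, Weight: 2705, "MPG.city": 25, "MPG.highway": 31 },
    { Make: "Geo Metro",     Horsepower: 55,  Weight: 1695, "MPG.city": 46, "MPG.highway": 50 }
];
console.log(transformCars(cars).length); // → 1
```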

7.7.1 Setting SQL type

If not specified, the SQL type of each column is set automatically based on its data type. This may not be adequate for some types. For example, the char data type is mapped to CHAR(size), where size is the maximum length found in the column, and floating point numbers are mapped to double.
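The default mapping just described can be sketched as follows (an assumed simplification with a hypothetical inferSqlType helper, not the exact Ax rules):

```javascript
// Sketch of the default type mapping: strings become CHAR(max length),
// integers become INTEGER, other numbers fall back to DOUBLE.
function inferSqlType(values) {
    if (values.every(v => typeof v === "string")) {
        const size = Math.max(...values.map(v => v.length));
        return `CHAR(${size})`;
    }
    if (values.every(v => Number.isInteger(v))) return "INTEGER";
    return "DOUBLE";
}

console.log(inferSqlType(["Audi", "Chevrolet"])); // → CHAR(9)
console.log(inferSqlType([21, 25]));              // → INTEGER
console.log(inferSqlType([2.4, 1270]));           // → DOUBLE
```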

Continuing the previous example, if we want to set Manufacturer as VARCHAR(40) and the MPG(Highway/City) column as DECIMAL(12,4) we can do:

Copy
<script>
    
    ...
    
    // set Manufacturer
    options.setColumnType("Manufacturer", Ax.sql.Types.VARCHAR);
    options.setColumnSize("Manufacturer", 40);

     // set MPG(Highway/City) using setColumnType(type, size, scale)
    options.setColumnType("MPG(Highway/City)", Ax.sql.Types.DECIMAL, 12, 4);
</script>

Notice that the mapping uses the column name as present in the source, and this name may change when applied to the database. For example, we must use MPG(Highway/City), case sensitive, to set mappings even though it will be converted to mpg_highway_city_ when created as a table column.
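The name conversion observed in the logs is consistent with lower-casing the name and replacing every non-alphanumeric character with an underscore. A sketch of that assumed rule (toDbColumnName is a hypothetical helper):

```javascript
// Sketch of the column-name conversion seen in the generated DDL:
// "MPG(Highway/City)" → "mpg_highway_city_".
function toDbColumnName(name) {
    return name.toLowerCase().replace(/[^a-z0-9]/g, "_");
}

console.log(toDbColumnName("MPG(Highway/City)")); // → mpg_highway_city_
console.log(toDbColumnName("Rev.per.mile"));      // → rev_per_mile
console.log(toDbColumnName("Mass (MeV/c^2)"));    // → mass__mev_c_2_
```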

8 Debugging

Both Ax.rs.Reader and Ax.rs.Writer have a debug system that lets you monitor the process.

You can wire the log from Ax.rs.Writer to the console. This redirects the debug output to the JavaScript console so you can analyze data mapping, table creation, etc. To do that, connect the console logger to the options of the operation you want to debug.

For example, to debug the write method of the previous example, simply add the following call to the options configurator.

Copy
options.setLogger(console.getLogger());
00:00:00.704:           DBExportModelDB - Options......: deister.axional.server.jdbc.io.options.DB.DBExportOptions@21f016f0
00:00:00.704:           DBExportModelDB - Columns(in)..: 28
00:00:00.704:           DBExportModelDB - ColumnNames..: [Manufacturer, Model, Type, Min.Price, Price, Max.Price, MPG.city, MPG.highway, AirBags, DriveTrain, Cylinders, EngineSize, Horsepower, RPM, Rev.per.mile, Man.trans.avail, Fuel.tank.capacity, Passengers, Length, Wheelbase, Width, Turn.circle, Rear.seat.room, Luggage.room, Weight, Origin, Make, MPG(Highway/City)]
00:00:00.705:           DBExportModelDB - Columns(out).: 28
00:00:00.705:           DBExportModelDB - ColumnNames..: [Manufacturer, Model, Type, Min.Price, Price, Max.Price, MPG.city, MPG.highway, AirBags, DriveTrain, Cylinders, EngineSize, Horsepower, RPM, Rev.per.mile, Man.trans.avail, Fuel.tank.capacity, Passengers, Length, Wheelbase, Width, Turn.circle, Rear.seat.room, Luggage.room, Weight, Origin, Make, MPG(Highway/City)]
00:00:00.705:           DBExportModelDB - 
+------------------+-----+-------+--------+----+-----+-------+-------+
|Name              |Index|SQLType|JavaType|Size|Scale|NotNull|Printer|
+------------------+-----+-------+--------+----+-----+-------+-------+
|Manufacturer      |    0|      1|CHAR    |  13|    0|true   |       |
|Model             |    1|      1|CHAR    |  14|    0|true   |       |
|Type              |    2|      1|CHAR    |   7|    0|true   |       |
|Min.Price         |    3|      8|DOUBLE  |   0|    0|true   |       |
|Price             |    4|      8|DOUBLE  |   0|    0|true   |       |
|Max.Price         |    5|      8|DOUBLE  |   0|    0|true   |       |
|MPG.city          |    6|      4|INTEGER |   0|    0|true   |       |
|MPG.highway       |    7|      4|INTEGER |   0|    0|true   |       |
|AirBags           |    8|      1|CHAR    |  18|    0|true   |       |
|DriveTrain        |    9|      1|CHAR    |   5|    0|true   |       |
|Cylinders         |   10|      1|CHAR    |   6|    0|true   |       |
|EngineSize        |   11|      8|DOUBLE  |   0|    0|true   |       |
|Horsepower        |   12|      4|INTEGER |   0|    0|true   |       |
|RPM               |   13|      4|INTEGER |   0|    0|true   |       |
|Rev.per.mile      |   14|      4|INTEGER |   0|    0|true   |       |
|Man.trans.avail   |   15|     16|BOOLEAN |   0|    0|true   |       |
|Fuel.tank.capacity|   16|      8|DOUBLE  |   0|    0|true   |       |
|Passengers        |   17|      4|INTEGER |   0|    0|true   |       |
|Length            |   18|      4|INTEGER |   0|    0|true   |       |
|Wheelbase         |   19|      4|INTEGER |   0|    0|true   |       |
|Width             |   20|      4|INTEGER |   0|    0|true   |       |
|Turn.circle       |   21|      4|INTEGER |   0|    0|true   |       |
|Rear.seat.room    |   22|      8|DOUBLE  |   0|    0|false  |       |
|Luggage.room      |   23|      4|INTEGER |   0|    0|false  |       |
|Weight            |   24|      4|INTEGER |   0|    0|true   |       |
|Origin            |   25|      1|CHAR    |   7|    0|true   |       |
|Make              |   26|      1|CHAR    |  24|    0|true   |       |
|MPG(Highway/City) |   27|      8|DOUBLE  |   0|    0|false  |       |
+------------------+-----+-------+--------+----+-----+-------+-------+

00:00:00.706:           DBExportModelDB - INSERT INTO cars (Manufacturer,Model,Type,Min_Price,Price,Max_Price,MPG_city,MPG_highway,AirBags,DriveTrain,Cylinders,EngineSize,Horsepower,RPM,Rev_per_mile,Man_trans_avail,Fuel_tank_capacity,Passengers,Length,Wheelbase,Width,Turn_circle,Rear_seat_room,Luggage_room,Weight,Origin,Make,MPG_Highway_City_) VALUES (?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?)
00:00:00.707:           DBExportModelDB - Table cars will be created, temporary=true
00:00:00.709:           DBExportModelDB - CREATE TEMP TABLE cars (
                     manufacturer char (13) not null,
                            model char (14) not null,
                             type char (7) not null,
                        min_price float not null,
                            price float not null,
                        max_price float not null,
                         mpg_city integer not null,
                      mpg_highway integer not null,
                          airbags char (18) not null,
                       drivetrain char (5) not null,
                        cylinders char (6) not null,
                       enginesize float not null,
                       horsepower integer not null,
                              rpm integer not null,
                     rev_per_mile integer not null,
                  man_trans_avail boolean not null,
               fuel_tank_capacity float not null,
                       passengers integer not null,
                           length integer not null,
                        wheelbase integer not null,
                            width integer not null,
                      turn_circle integer not null,
                   rear_seat_room float,
                     luggage_room integer,
                           weight integer not null,
                           origin char (7) not null,
                             make char (24) not null,
                mpg_highway_city_ float
) WITH NO LOG
00:00:00.729:           DBExportModelDB -  40 row(s) inserted, 0 row(s) updated, 0 row(s) excluded, 0 error(s)