The Ax.rs.Writer function provides a mechanism to write data from a ResultSet data source to CSV, Excel, JSON, SQL or text output, or to commit it directly to a database through a transaction. This operation is also known as data export.

# 1 Writer function

The writer function provides distinct methods to generate the different output formats:

| Method | Resource type |
|--------|---------------|
| csv | generates CSV files |
| excel | creates Excel files |
| json | creates JSON files |
| sql | creates SQL commands to insert data |
| db | a special option that inserts data directly into a database table, so the output is a database transaction |

As usual, to describe the Ax.rs.Writer function we can use the Class function.

<script>
console.log(Class.describe(Ax.rs.Writer))
console.log(Class.describe(Ax.rs.Writer, "DBExportOptions"));
</script>

You can call getStatistics() to get a DBExportStatistics object with information about the number of rows inserted, updated, excluded and errors handled.

<script>
Ax.db.execute("DROP TABLE IF EXISTS antibiotics");
var rs_writer = new Ax.rs.Reader().csv(options => {
options.setResource("https://bitbucket.org/deister/axional-docs-resources/raw/master/CSV/antibiotics1.csv");
}).writer();

var rs1 = rs_writer.db(options => {
options.setLogger(console.getLogger());
options.setConnection(Ax.db.getObject());
options.setTableName("antibiotics");
options.setTableCreate(true);
options.setTablePrimaryKey("pk_antibiotics", "bacteria");
}
);

// Get statistics of db writing
var rs_stats = rs_writer.getStatistics();

console.log(rs_stats);
console.log("   Error: " + rs_stats.getErrorCount());
console.log("  Insert: " + rs_stats.getInsertCount());
console.log("  Update: " + rs_stats.getUpdateCount());
console.log("Excluded: " + rs_stats.getExcludeCount());
console.log(rs1);

</script>

# 2 Writing CSV

Writing a CSV file is simple. As with every writer method, you supply the configuration through a consumer (an arrow function `=>`).

<script>
var rs = Ax.db.executeQuery("SELECT * FROM systables");

var blob = new Ax.sql.Blob("systables.csv");
new Ax.rs.Writer(rs).csv(options => {
//options.setFile("/tmp/systables_file.csv");

options.setResource(blob);

// A header can be added so Excel recognises the file as CSV

// Wire logger to console logger to see writer debug
options.setLogger(console.getLogger());
});
return blob;
</script>

## 2.1 Describe CSVExportOptions

The options object is an instance of the CSVExportOptions configurator. You can inspect the object's class properties by getting its description as an object related to the main Writer class.

The options below show that we can set the charset, the CSV delimiter, the quote character and so on.
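To make the role of the delimiter and quote character concrete, here is a small, self-contained Java sketch of the standard RFC 4180 quoting rule such options typically control. This is illustrative only, not the Writer's actual internals:

```java
public class CsvQuote {
    // Quote a field only when it contains the delimiter, the quote char
    // or a newline; embedded quote chars are doubled (RFC 4180 style).
    static String quote(String field, char delimiter, char quoteChar) {
        boolean needsQuoting = field.indexOf(delimiter) >= 0
                || field.indexOf(quoteChar) >= 0
                || field.indexOf('\n') >= 0;
        if (!needsQuoting) return field;
        String doubled = field.replace(String.valueOf(quoteChar),
                String.valueOf(quoteChar) + quoteChar);
        return quoteChar + doubled + quoteChar;
    }

    public static void main(String[] args) {
        System.out.println(quote("plain", ';', '"'));       // plain
        System.out.println(quote("a;b", ';', '"'));         // "a;b"
        System.out.println(quote("say \"hi\"", ';', '"'));  // "say ""hi"""
    }
}
```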

<script>
console.log(Class.describe(Ax.rs.Writer, "CSVExportOptions"));
</script>

## 2.2 Formatting decimal output

There are multiple methods to set up the output format of Decimal numbers in generated CSV files.

setNumberFormat defines the output format of a Decimal column in a simple way. This method returns a DecimalFormat Java object, but usually no further action is required since you specify the format and locale as parameters.

A more powerful (and more complex) alternative is setDecimalFormat, which also exposes a DecimalFormat object that you can configure completely.

The second parameter of setNumberFormat and setDecimalFormat indicates the pattern to use for output. Many characters in a pattern are taken literally; they are matched during parsing and output unchanged during formatting. Special characters, on the other hand, stand for other characters, strings, or classes of characters. They must be quoted, unless noted otherwise, if they are to appear in the prefix or suffix as literals.

The characters listed here are used in non-localized patterns. Localized patterns use the corresponding characters taken from this formatter's DecimalFormatSymbols object instead, and these characters lose their special status. Two exceptions are the currency sign and quote, which are not localized.

| Symbol | Location | Meaning |
|--------|----------|---------|
| 0 | Number | Digit |
| # | Number | Digit, zero shows as absent |
| . | Number | Decimal separator or monetary decimal separator |
| - | Number | Minus sign |
| , | Number | Grouping separator |
| E | Number | Separates mantissa and exponent in scientific notation. Need not be quoted in prefix or suffix. |
| ; | Subpattern boundary | Separates positive and negative subpatterns |
| % | Prefix or suffix | Multiply by 100 and show as percentage |
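These symbols come straight from Java's java.text.DecimalFormat class, so their effect can be checked with plain Java:

```java
import java.text.DecimalFormat;
import java.text.DecimalFormatSymbols;
import java.util.Locale;

public class PatternDemo {
    public static void main(String[] args) {
        DecimalFormatSymbols us = DecimalFormatSymbols.getInstance(Locale.US);

        // "#,##0.00;(#,##0.00)": grouping, two forced decimals, and a
        // negative subpattern after ';' rendered in parentheses
        DecimalFormat df = new DecimalFormat("#,##0.00;(#,##0.00)", us);
        System.out.println(df.format(1234.5));   // 1,234.50
        System.out.println(df.format(-1234.5));  // (1,234.50)

        // '%' multiplies by 100 on output
        DecimalFormat pct = new DecimalFormat("0.0%", us);
        System.out.println(pct.format(0.256));   // 25.6%
    }
}
```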
<script>
var blob = new Ax.sql.Blob("systables.csv");

var rs = Ax.db.executeQuery(
    "SELECT tabname, tabid, nrows, " +
    "       1234567890.123456789 colname_double, 1234567890.123456789 col_dec1, 1234567890.123456789 col_dec2 " +
    "  FROM systables WHERE tabid < 10"
);

console.log("Writing CSV of systables");

new Ax.rs.Writer(rs).csv(options => {
options.setDelimiter(";");
options.setLogger(console.getLogger());
options.setResource(blob);

// ===============================================================
// The return of setNumberFormat is a DecimalFormat
// ===============================================================

var nf = options.getFormats().setNumberFormat("colname_double", "##,##", "es");

// ===============================================================
// The return of setDecimalFormat is a DecimalFormat.
// Notice that setDecimalFormat will fail if applied to a non
// decimal column.
//
// Cannot set decimal format for integer columns like nrows
// java.lang.Double cannot be cast to class java.math.BigDecimal
// ===============================================================

for (var colname of ['col_dec1', 'col_dec2']) {
var df = options.getFormats().setDecimalFormat(colname, "#,##0.00;(#,##0.00)");
df.setMinimumFractionDigits(4);
df.setMaximumFractionDigits(7);
df.setGroupingSize(0);

//
// Convert US format "1,234,567,890.12345679" to ES "1.234.567.890,12345679"
// changing DecimalFormatSymbols and setting it back to DecimalFormat
//
var  dfs = df.getDecimalFormatSymbols();
dfs.setDecimalSeparator(',');
dfs.setGroupingSeparator('.');

df.setDecimalFormatSymbols(dfs);
}
});

console.log("Content type=" + blob.getContentType());
console.log(blob.getText());

return blob;
</script>
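The US-to-ES separator switch done inside the loop above relies only on standard java.text.DecimalFormatSymbols behavior, which can be verified in isolation:

```java
import java.text.DecimalFormat;
import java.text.DecimalFormatSymbols;
import java.util.Locale;

public class SymbolsDemo {
    public static void main(String[] args) {
        // Start from US symbols, then swap the separators to Spanish style
        DecimalFormatSymbols dfs = new DecimalFormatSymbols(Locale.US);
        dfs.setDecimalSeparator(',');
        dfs.setGroupingSeparator('.');

        DecimalFormat df = new DecimalFormat("#,##0.00", dfs);
        System.out.println(df.format(1234567.89)); // 1.234.567,89
    }
}
```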

# 3 Writing Excel

We have already seen write operations as a continuation of a previous read. Now, let's see how easy it is to export any database query to CSV, Excel or SQL format by simply using the Ax.rs.Writer API.

The following example exports to Excel the results of a query on the database table systables. The output (an Excel file) is sent to an in-memory blob.

<script>
var rs = Ax.db.executeQuery("SELECT * FROM systables");

var blob = new Ax.sql.Blob("systables.xls");
new Ax.rs.Writer(rs).excel(options => {
options.setResource(blob);
});
return blob;
</script>

# 4 Writing JSON

You can easily generate JSON from a ResultSet:


<script>
var blob = new Ax.sql.Blob("systables.json");
var rs = Ax.db.executeQuery("SELECT tabid, tabname, nrows, npused FROM systables WHERE tabid < 5");
new Ax.rs.Writer(rs).json(options => {
options.setLogger(console.getLogger());
options.setPrettyPrint(true);
options.setResource(blob);

});
rs.close();
console.log(new Ax.lang.String(blob.getBytes()));
return blob;
</script>

# 5 Writing SQL schema

The following example reads a CSV, transforms it (adds a calculated column) and creates an SQL text file with the CREATE TABLE statement and the INSERT statements needed to fill the table with the CSV data.

<script>
var blob = new Ax.sql.Blob("cars.sql");
new Ax.rs.Reader().csv(options => {
    options.setResource("https://bitbucket.org/deister/axional-docs-resources/raw/master/CSV/cars93.csv");
    options.setExcludeColumnIndexes(0);
}).rows().select(row => {
    // Keep only cars with a horsepower/weight ratio above 0.1
    let weightKG = row.getDouble("Weight") * 0.453592;
    let horsepower = row.getDouble("Horsepower");
    return horsepower / weightKG > 0.1;
}).createColumn("MPG(Highway/City)", v => {
    // Calculated column: highway/city consumption ratio
    // (the method name used to add the calculated column is assumed here)
    let cityMpg = v.getDouble("MPG.city");
    let highwayMpg = v.getDouble("MPG.highway");
    return highwayMpg / cityMpg;
}).writer().sql(options => {
    options.setResource(blob);
    options.setTableName("cars");
    options.setCreateTable(true);
    // You can specify the target database driver (default is INFORMIX)
    // options.setDriver(Ax.db.getDriver());
});
return blob;
</script>

You may need to set the target database driver to generate the appropriate database schema. It is also needed to produce correct values for DATE and DATETIME column types in the INSERT statements.
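The reason the driver matters can be illustrated with a small, hypothetical mapping (the method and type choices below are illustrative, not the Writer's actual internals): the same logical timestamp column needs different DDL per engine.

```java
import java.util.Locale;

public class DdlTypeMapper {
    // Hypothetical per-driver DDL type for a timestamp column
    static String timestampType(String driver) {
        switch (driver.toUpperCase(Locale.ROOT)) {
            case "INFORMIX":   return "DATETIME YEAR TO SECOND";
            case "POSTGRESQL": return "TIMESTAMP";
            case "ORACLE":     return "TIMESTAMP";
            default:           return "TIMESTAMP";
        }
    }

    public static void main(String[] args) {
        System.out.println(timestampType("informix"));   // DATETIME YEAR TO SECOND
        System.out.println(timestampType("postgresql")); // TIMESTAMP
    }
}
```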

# 6 Writing SQL inserts

This feature obtains a text with the SQL INSERT statements for the data of a table, allowing you to regenerate this content on another table with the same schema.

The executed query will be stored into a blob using the function Ax.rs.Writer(ResultSet).sql, indicating the corresponding table name:

<script>
var dbsrc = Ax.db.of("db_source");
var rs    = dbsrc.executeQuery("SELECT FIRST 4 * FROM sysusers");

var blob = new Ax.sql.Blob("tmp");
new Ax.rs.Writer(rs).sql(options => {
    options.setTableName("sysusers");
    options.setResource(blob);
});
rs.close();
var text = blob.getText();
console.log(text);
</script>
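Conceptually, each row of the ResultSet becomes one INSERT statement, with string values quoted and embedded single quotes doubled. A minimal Java sketch of that generation step (illustrative only, not the actual Writer code):

```java
import java.util.LinkedHashMap;
import java.util.StringJoiner;

public class InsertBuilder {
    // Render a value as an SQL literal: NULL, bare number, or quoted string
    static String toSqlLiteral(Object v) {
        if (v == null) return "NULL";
        if (v instanceof Number) return v.toString();
        return "'" + v.toString().replace("'", "''") + "'";
    }

    // Build one INSERT statement from an ordered column/value map
    static String insert(String table, LinkedHashMap<String, Object> row) {
        StringJoiner cols = new StringJoiner(", ");
        StringJoiner vals = new StringJoiner(", ");
        row.forEach((c, v) -> { cols.add(c); vals.add(toSqlLiteral(v)); });
        return "INSERT INTO " + table + " (" + cols + ") VALUES (" + vals + ");";
    }

    public static void main(String[] args) {
        LinkedHashMap<String, Object> row = new LinkedHashMap<>();
        row.put("usr_id", 4);
        row.put("usr_name", "O'Brien");
        System.out.println(insert("sysusers", row));
        // INSERT INTO sysusers (usr_id, usr_name) VALUES (4, 'O''Brien');
    }
}
```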

The following sample saves into the catalog table wic_table_object_data the INSERT statements used to load data into the new tables created from the schema model when a database is initialized. First, it selects the tables that are marked as 'MASTER' and executes a query on the functional database to obtain the data. Then it converts this data into SQL INSERT statements. Finally, it inserts them into wic_table_object_data (tab_data field).

<script>
var dict = Ax.db.of("wic_new");
console.log("Source is:" + Ax.db);
console.log("Target is:" + dict);

var tables = dict.executeQuery("SELECT tab_name FROM wic_table_object WHERE tab_label = 'MASTER'");
for (var row of tables) {
var tableName = row.get("tab_name");
var rs = Ax.db.executeQuery("SELECT * FROM " + tableName);

var blob = new Ax.sql.Blob("tmp");
new Ax.rs.Writer(rs).sql(options => {
options.setTableName(tableName);
options.setResource(blob);
}
);
rs.close();

var data =  {
    "tab_name"      : tableName,
    "tab_delimiter" : 'S',
    "tab_data"      : blob.getText()
};
dict.execute("DELETE FROM wic_table_object_data WHERE tab_name = '" + tableName + "'");
dict.insert("wic_table_object_data", data);
}
</script>

# 7 Writing to database

You can directly transfer data to database tables. To do that, you simply need to specify the table name. The write function will automatically map the specified column names to table columns and perform the insert operations.

You can control table creation if needed. By default, the writer will not create the table unless you specify it. The method setCreate(boolean create, boolean isTemp) controls whether the table should be created before inserting.

You can specify that the table is created automatically if it does not exist, and whether it should be temporary.

## 7.1 Describe DBExportOptions

The options object is an instance of the DBExportOptions configurator. You can inspect the object's class properties by getting its description as an object related to the main Writer class.

<script>
console.log(Class.describe(Ax.rs.Writer, "DBExportOptions"));
</script>

## 7.2 Batch size

You can set the batch size for insert operations using options.setBatchSize(n).

Batch size only applies to insert operations. In insert-only mode (no PK present), inserts are sent as a batch insert operation.

In update-insert mode (PK present), an update of the row is executed first and, if the record is not found, the data is inserted. This last insert operation is executed in batch mode.

To disable batch insert operations you can set the batch size to 0.

<script>
new Ax.rs.Reader().csv(options => {
    // Wire console logger to see debug information from the reader
    options.setLogger(console.getLogger());
    // Log every 10 rows
    options.setLogSize(10);
    options.setDelimiter(";");
    options.setFile("/tmp/data1.csv");
    options.setMemoryMapped(true);
    options.setColumnNameMapping((colName, colOrdinal) => {
        switch (colOrdinal) {
            case 0:     return "code";
            case 1:     return "name";
            default:    return colName;
        }
    });
}).writer().db(options => {
    options.setLogger(console.getLogger());
    options.setConnection(Ax.db.getObject());
    options.setBatchSize(5000);
    options.setTableName("vsc_test");
    options.setInsertFirst(true);
});
</script>

options.setInsertFirst(true) tells the writer to try a direct insert row by row; if a primary key error is received, the row is updated instead (a direct update when the batch size is 0, a batch update otherwise).

## 7.3 Commit every

You can set up an automatic commit every given number of processed rows to avoid long transactions by using options.setCommitSize(n).

If you have set a batch size, it is recommended to set a commit size that is a multiple of the batch size: committing the transaction before the batched rows are flushed to the server does nothing and is a waste of time.
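To see why the commit size should be a multiple of the batch size, here is a small Java model (illustrative only, not the Writer's implementation) of the flush/commit cadence: rows accumulate in a batch that is flushed every batchSize rows, and a commit is issued every commitSize rows, forcing a flush first.

```java
import java.util.ArrayList;
import java.util.List;

public class BatchCommitSketch {
    final int batchSize, commitSize;
    final List<String> events = new ArrayList<>();
    int pending = 0, total = 0;

    BatchCommitSketch(int batchSize, int commitSize) {
        this.batchSize = batchSize;
        this.commitSize = commitSize;
    }

    void write(String row) {
        pending++; total++;
        if (pending == batchSize) flush();
        if (commitSize > 0 && total % commitSize == 0) {
            flush();                      // commit must flush the batch first
            events.add("COMMIT@" + total);
        }
    }

    void flush() {
        if (pending > 0) events.add("FLUSH " + pending);
        pending = 0;
    }

    public static void main(String[] args) {
        // commitSize (4) is a multiple of batchSize (2): commits land
        // right after a flush, never in the middle of a batch
        BatchCommitSketch w = new BatchCommitSketch(2, 4);
        for (int i = 1; i <= 5; i++) w.write("row" + i);
        w.flush();
        System.out.println(w.events); // [FLUSH 2, FLUSH 2, COMMIT@4, FLUSH 1]
    }
}
```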

## 7.4 Insert

In the following example we will see how to load a data set and use its column information to automatically create a new table. We have three data sets:

• antibiotics the complete information about antibiotics.
• antibiotics1 partial information about antibiotics (antibiotics 1 to 10 with missing gram for 1 and 2)
• antibiotics2 partial information about antibiotics that updates and completes previous data set. (antibiotics 11 to 16 and fixed values for 1 and 2)

Let's see how to insert the antibiotics1 data set into a new table named antibiotics.

<script>
// Simply ensure tables does not exist for our test
Ax.db.execute("DROP TABLE IF EXISTS antibiotics");

var rs1 = new Ax.rs.Reader().csv(options => {
options.setResource("https://bitbucket.org/deister/axional-docs-resources/raw/master/CSV/antibiotics1.csv");
}).writer().db(options => {
// wire log to console
options.setLogger(console.getLogger());
// apply operations on current database
options.setConnection(Ax.db.getObject());
// apply operations on table "antibiotics"
options.setTableName("antibiotics");
// create table
options.setTableCreate(true);
// create primary key named "pk_antibiotics" on columns bacteria
options.setTablePrimaryKeyName("pk_antibiotics");
options.setTablePrimaryKeyColumns("bacteria");
}
);
console.log(rs1);
</script>

Notice that we can see from the logs that the table has been created with the primary key on the specified column. Also, each column has the appropriate type and size for its data, and nulls are admitted on columns that contain nulls.

### 7.4.1 Controlling data type

As seen in the previous example, the table data types and the not null attribute are set according to the input metadata.

The data type of each column comes from the input as follows:

• If the source is a database result set, the type is taken from the database metadata.
• If the source is a CSV or Excel load, it will be the optimum data type for the input: numeric (an integer or long if there is no floating point, a double if there is), date, time, datetime or string.

The not null attribute comes from the input as follows:

• If the source is a database result set, not null is taken from the database metadata.
• If the source is a CSV or Excel load, the column will be not null if and only if all values in the column are not null.
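The "optimum data type" rule described above can be sketched as follows. This is an illustrative approximation of the inference, not the loader's exact algorithm:

```java
import java.util.Arrays;
import java.util.List;

public class TypeInference {
    // Infer a column type from its string values: integers first,
    // widened to floating point, falling back to a string type.
    // Not null holds only when no value is missing.
    static String inferType(List<String> values) {
        boolean allLong = true, allDouble = true, allNotNull = true;
        for (String v : values) {
            if (v == null || v.isEmpty()) { allNotNull = false; continue; }
            try { Long.parseLong(v); }
            catch (NumberFormatException e) { allLong = false; }
            try { Double.parseDouble(v); }
            catch (NumberFormatException e) { allDouble = false; }
        }
        String type = allLong ? "BIGINT" : allDouble ? "DOUBLE" : "VARCHAR";
        return type + (allNotNull ? " NOT NULL" : "");
    }

    public static void main(String[] args) {
        System.out.println(inferType(Arrays.asList("1", "2", "3")));   // BIGINT NOT NULL
        System.out.println(inferType(Arrays.asList("1", "2.5")));     // DOUBLE NOT NULL
        System.out.println(inferType(Arrays.asList("1", "", "abc"))); // VARCHAR
    }
}
```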

Using the options configurator you can change both the type and the not null attribute of a column. For example, if we want Neomycin to admit nulls and have type DECIMAL(12,4), we can do:

<script>

...
options.setColumnType("Neomycin", Ax.sql.Types.DECIMAL, 12, 4);
options.setColumnIsNotNull("Neomycin", false);
</script>

The table column types will be set up accordingly.

Using the option setTableSource(tablename) you can create the table to load data into as a copy of an existing database table. For example, imagine that table antibiotics is a physical table and we want to load data into a temp table with the same structure as table antibiotics.

<script>

Ax.db.execute(
    "CREATE TABLE antibiotics(Id SMALLINT, " +
    "Bacteria VARCHAR(30), " +
    "Penicillin DECIMAL(6,3), " +
    "Streptomycin DECIMAL(6,3), " +
    "Neomycin DECIMAL(6,3), " +
    "Gram CHAR(10))"
);
...

// Apply operations on table "tmp_antibiotics"
options.setTableName("tmp_antibiotics");

// Create table as temp table
options.setTableCreateTemp(true);

// Create temp table as antibiotics table.
options.setTableSource("antibiotics");

</script>

## 7.5 Update or Insert (merge)

Now we can use antibiotics2 data set to do a merge on table antibiotics. Rows that match primary key will be updated while rows not present will be inserted.

If no physical primary key is present on the destination table, you can simulate its existence by invoking: options.setTablePrimaryKeyColumns("pkcols");

If neither a physical primary key nor a virtual PK is defined, the db writer will execute an insert-only operation.

Notice that the previous operation created a PRIMARY KEY on the table, as we need a PRIMARY KEY to update a table.
<script>
var rs2 = new Ax.rs.Reader().csv(options => {
options.setResource("https://bitbucket.org/deister/axional-docs-resources/raw/master/CSV/antibiotics2.csv");
}).writer().db(options => {
options.setLogger(console.getLogger());
options.setConnection(Ax.db.getObject());
options.setTableName("antibiotics");
}
);
console.log(rs2);
</script>
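The update-or-insert behavior can be modeled with a few lines of Java, using a map in place of the target table (purely illustrative): try an UPDATE by primary key first, and INSERT when the key is absent.

```java
import java.util.HashMap;
import java.util.Map;

public class MergeSketch {
    // Update-or-insert: UPDATE when the key exists, INSERT otherwise
    static String upsert(Map<String, Double> table, String pk, double value) {
        boolean exists = table.containsKey(pk);
        table.put(pk, value);
        return exists ? "UPDATE" : "INSERT";
    }

    public static void main(String[] args) {
        Map<String, Double> antibiotics = new HashMap<>();
        antibiotics.put("Aerobacter aerogenes", 870.0);
        // Existing key: the row is updated
        System.out.println(upsert(antibiotics, "Aerobacter aerogenes", 1.0)); // UPDATE
        // New key: the row is inserted
        System.out.println(upsert(antibiotics, "Brucella abortus", 1.0));     // INSERT
    }
}
```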

### 7.5.1 Comparing results

Now, we can compare both data sets. The source data set with the master antibiotics data should match our database data loaded from the two files.

<script>
var rs_gen = Ax.db.executeQuery("SELECT * FROM antibiotics").toMemory();
var rs_src = new Ax.rs.Reader().csv(options => {
options.setResource("https://bitbucket.org/deister/axional-docs-resources/raw/master/CSV/antibiotics.csv");
});

console.log(rs_src);
console.log(rs_gen);
</script>

## 7.6 Handling errors

We can control error handling during database insert or update operations to decide whether to abort the operation or continue while keeping logs. For this example we will use two files:

• particles, which contains the elementary physical particles except W Boson+ and W Boson-
• particles-extra, which contains the 2 particles missing from the previous file (the W Bosons) and the Tau particle with null values.

We will load the physics elementary particles CSV file, creating a table and setting a primary key on the particle name. As the loader determines that no column contains nulls, all columns will be created as not null.

<script>
var rs1 = new Ax.rs.Reader().csv(options => {
options.setResource("https://bitbucket.org/deister/axional-docs-resources/raw/master/CSV/particles.csv");
}).writer().db(options => {
options.setLogger(console.getLogger());
options.setConnection(Ax.db.getObject());
options.setTableName("particles");
options.setTableCreate(true);
options.setTablePrimaryKeyName("pk_particles");
options.setTablePrimaryKeyColumns("name");
}
);
console.log(rs1);
</script>

Now we can try to add (merge) the particles-extra CSV file, which contains some missing values. As the columns in the particles table have been created as not null, any missing value will make the load of that row fail and, consequently, the whole loading process fail.

<script>
new Ax.rs.Reader().csv(options => {
    options.setResource("https://bitbucket.org/deister/axional-docs-resources/raw/master/CSV/particles-extra.csv");
}).writer().db(options => {
    options.setLogger(console.getLogger());
    options.setConnection(Ax.db.getObject());
    options.setTableName("particles");
});
</script>

We can check that no rows have been inserted or updated.

<script>
console.log(Ax.db.executeQuery("SELECT id, name from particles"));
</script>

### 7.6.3 Error handler

Let's now put an error handler. The error handler is a function that will be called by the loader on every error. The loader passes it an error object with the error information and expects a return value indicating whether to continue:

• If true, the error is ignored and the load will continue.
• If false, the load process will stop and the whole transaction will be rolled back.
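The continuation contract can be modeled like this (an illustrative Java sketch, not the loader's code): the handler decides per row whether to skip it or abort and discard everything loaded so far.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;
import java.util.function.Predicate;

public class LoaderSketch {
    // The loader calls errorHandler on every failing row: true skips the
    // row and continues, false aborts the whole load (rollback).
    static List<String> load(List<String> rows, Predicate<String> isValid,
                             Predicate<String> errorHandler) {
        List<String> loaded = new ArrayList<>();
        for (String row : rows) {
            if (isValid.test(row)) {
                loaded.add(row);
            } else if (!errorHandler.test(row)) {
                return Collections.emptyList(); // abort: nothing is kept
            }
        }
        return loaded;
    }

    public static void main(String[] args) {
        List<String> rows = Arrays.asList("W Boson+", "W Boson-", "");
        Predicate<String> notEmpty = r -> !r.isEmpty();
        // Handler returns true: valid rows survive, the bad row is skipped
        System.out.println(load(rows, notEmpty, r -> true));  // [W Boson+, W Boson-]
        // Handler returns false: the load aborts and keeps nothing
        System.out.println(load(rows, notEmpty, r -> false)); // []
    }
}
```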

<script>
new Ax.rs.Reader().csv(options => {
    options.setResource("https://bitbucket.org/deister/axional-docs-resources/raw/master/CSV/particles-extra.csv");
}).writer().db(options => {
    options.setLogger(console.getLogger());
    options.setConnection(Ax.db.getObject());
    options.setTableName("particles");
    options.setErrorHandler(error => {
        console.log("Row     : " + error.getRow());
        console.log("Type    : " + error.getType());
        console.log("Data    : " + error.getData());
        console.log("Error   : " + error.getErrorCode());
        console.log("SQLCode : " + error.getSQLCode());
        console.log("Message : " + error.getMessage());
        // Continue ignoring the error
        return true;
    });
});
</script>

The transaction has ignored the failing row (the Tau particle), which failed in both UPDATE and INSERT, but it has inserted the two valid rows.

## 7.7 Transform and insert

The following example reads a CSV, transforms it (removes the first column and adds a calculated column) and inserts the data into a newly created table.

<script>
Ax.db.execute("DROP TABLE IF EXISTS cars");
new Ax.rs.Reader().csv(options => {
    options.setResource("https://bitbucket.org/deister/axional-docs-resources/raw/master/CSV/cars93.csv");
    options.setExcludeColumnIndexes(0);
}).rows().select(row => {
    // Keep only cars with a horsepower/weight ratio above 0.1
    let weightKG = row.getDouble("Weight") * 0.453592;
    let horsepower = row.getDouble("Horsepower");
    return horsepower / weightKG > 0.1;
}).createColumn("MPG(Highway/City)", v => {
    // Calculated column: highway/city consumption ratio
    // (the method name used to add the calculated column is assumed here)
    let cityMpg = v.getDouble("MPG.city");
    let highwayMpg = v.getDouble("MPG.highway");
    return highwayMpg / cityMpg;
}).writer().db(options => {
    options.setConnection(Ax.db.getObject());
    options.setTableName("cars");
    // Create table as temporary
    options.setTableCreateTemp(true);
});
return Ax.db.executeQuery("SELECT * FROM cars");
</script>

### 7.7.1 Setting SQL type

If not specified, the SQL type of the data is set automatically based on the column type. This may not be adequate for some types. For example, the char data type is mapped to CHAR(size), where size is the maximum column size, and floating point numbers are mapped to DOUBLE.

Continuing the previous example, if we want to set Manufacturer as VARCHAR(40) and the MPG(Highway/City) column as DECIMAL(12,4), we can do:

<script>

...

// set Manufacturer
options.setColumnType("Manufacturer", Ax.sql.Types.VARCHAR);
options.setColumnSize("Manufacturer", 40);

// set MPG(Highway/City) using setColumnType(type, size, scale)
options.setColumnType("MPG(Highway/City)", Ax.sql.Types.DECIMAL, 12, 4);
</script>

Notice that the mapping uses the column name as present in the source, and this name may change when applied to the database. For example, we must use MPG(Highway/City) (case sensitive) to set the mappings, even though it will be converted to mpg_highway_city_ when it is created as a column in the table.
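A plausible sanitization rule producing that result (the exact rule is product-defined, so this is only a sketch) is lower-casing the name and replacing every non-alphanumeric character with an underscore:

```java
import java.util.Locale;

public class ColumnNameSanitizer {
    // Hypothetical rule: lower-case, then map every character outside
    // [a-z0-9] to '_'
    static String sanitize(String name) {
        return name.toLowerCase(Locale.ROOT).replaceAll("[^a-z0-9]", "_");
    }

    public static void main(String[] args) {
        System.out.println(sanitize("MPG(Highway/City)")); // mpg_highway_city_
    }
}
```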

# 8 Debugging

Both Ax.rs.Reader and Ax.rs.Writer have a debug system that enables monitoring of the process.

You can wire the log from Ax.rs.Writer to the console. This will redirect the debug output to the JavaScript console so you can analyze data mapping, table creation, etc. To do that, connect the console logger to the options of the operation you want to debug.

For example, to debug the write method in the previous example, simply add the following call to the options configurator:

options.setLogger(console.getLogger());