Bulk Loading into InfiniDB


The bulk loader from InfiniDB is pretty fast; I loaded 1.8 million records in 62 seconds, which is not bad going. Their PDF instructions are pretty comprehensive, so here is just a snippet to get you started and having a go.

 

You must first create an XML job file based on the table you want to load into, then issue the import command. EASY 🙂

This is based on a standard install in the default locations:

cd /usr/local/Calpont/bin/

./colxml schema_name -t table_name -d 'delimiter' -l textfilename_that_will_be_imported -j job_id

e.g.

./colxml pentaho -t dim_customer_details_tab -d ';' -l customer_details.txt -j 1000
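
Just so you can picture it, the customer_details.txt in this example would be a plain semicolon-delimited file with one record per line. Something along these lines (the columns and values here are purely my own illustration, use whatever matches your table):

1;Jane;Smith;Manchester;2010-01-15
2;Raj;Patel;Leeds;2010-01-16
3;Ana;Gomez;Bristol;2010-01-17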

 

The text file to be imported must reside in the following directory:

/usr/local/Calpont/data/bulk/data/import
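
So before running the import, copy your file in there first, e.g. (adjust the source path to wherever your file actually lives):

cp /tmp/customer_details.txt /usr/local/Calpont/data/bulk/data/import/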

 

Then, to import the file:

cd /usr/local/Calpont/bin/

./cpimport -j job_id (the same job ID you gave colxml)

e.g.

./cpimport -j 1000
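
Once it finishes, a quick row count is an easy sanity check. Using whichever MySQL client your InfiniDB install provides (you may need -u/-p options depending on your setup), and the schema/table names from the example above:

mysql -e "SELECT COUNT(*) FROM pentaho.dim_customer_details_tab;"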

 

 
