TIMPORT problem - Too much data in a field?!?

Posted: Fri Feb 02, 2007 1:50 pm
by barry.marcus
What would cause this kind of message from TIMPORT:

200 Reading schema /resource/schema.scm
800 Inserting default "id" field into table
800 Statement: "insert into TEMP values(counter,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?);"
200 Connecting to server
200 Opening database /texis/camp
200 Verifying schema
200 Loading data
800 File: "/scratch/2.timport"
100 no end delimiter located within buffer In the function: freadex
100 record 0 from /scratch/2.timport truncated at file offset 1048576
204 2 records added
204 2.0 records/sec

I've double- and triple-checked... The delimiters are correct, and there is a single record in the file. You should know that the delimited file for this single record is 1314 KB. I have many identically *structured* files, some with multiple records per file, and they all import correctly. The single record in this one file is one of the biggest records I have.

Is there a way to get TIMPORT to tolerate this much data?

Thanks in advance.

TIMPORT problem - Too much data in a field?!?

Posted: Fri Feb 02, 2007 3:13 pm
by mark
Set a larger "recsize" in your schema.

http://www.thunderstone.com/site/texism ... ort-2.html
recsize sets the size of the maximum readable record when using
recdelim or readexpr. The default is 1 megabyte.
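
Note that the default works out to exactly where your record was cut off: 1 megabyte is 1048576 bytes, the file offset in your "truncated" message, so your ~1314 KB record overflows the read buffer before freadex can find the end delimiter.

A minimal sketch of the fix, assuming recsize is given as a byte count (which that 1048576-byte offset suggests); only this line is added or changed, and the rest of /resource/schema.scm stays as-is:

# raise the per-record read buffer from the 1 MB default to 2 MB,
# comfortably above the 1314 KB record being truncated
recsize 2097152

After that, rerun the import; once the whole record fits in the buffer, freadex will find the end delimiter and the record should load intact.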