
Problem with trigger on large varchar field

Former Member

Hello,

I've noticed a strange problem with triggers. I have two tables: the first holds the current data (REQUEST) and the second is a change log for the first table (TRC_REQUEST). The change log contains the same fields as the original table plus fields for the user (who made the change), the date and time (when the change took place), and the type of change (Insert, Update, Delete).

Each time my application makes a change in the REQUEST table, the triggers defined on the REQUEST table insert a new record into the change log.

The problem occurs when the original table contains a varchar field longer than 4000 - we've made several tests and this looks like the only cause. When the field (for a description) is 4000 characters or shorter, everything works as expected. If, however, we alter it to be longer than 4000, then every operation on the original table takes very long and makes the temporary data space grow without bound (and it finally freezes the database when it reaches the maximum data space configured for our database).

Could anyone confirm this is a bug? For us, the only workaround is to cut the field down to 4000. Of course, our database is non-UNICODE (although the catalog is), so a single row can have a maximum size of 8000.

We've noticed this behaviour after migrating our database (with the help of the loader) from 7.4 - with the old version the problem did not exist.

Best regards,

Marcin Pytel

Edited by: Marcin Pytel on Apr 12, 2010 9:03 AM

We use version 7.7.06.09

Accepted Solutions (0)

Answers (2)

Former Member

Hello,

Thank you for posting this.

This will be corrected as soon as possible.

For explanation:

Internally, the ASCII data is converted to UNICODE. If the internal length needed for storing data of maximum length exceeds 80xy bytes, the problem occurs.

Therefore there is a limit of 40yz ASCII characters usable as a trigger 'parameter'.
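
To illustrate with the numbers from this thread (only an approximation; it assumes the internal UNICODE representation needs about 2 bytes per character, which matches the 8000-byte row limit and the 4000-character workaround mentioned in the first post - the exact internal limit, the '80xy' above, is not spelled out here):

    Varchar (4000) ASCII -> 4000 * 2 =  8000 bytes internally (still fits)
    Varchar (5000) ASCII -> 5000 * 2 = 10000 bytes internally (exceeds the limit, hence the problem)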

This is true in 7.7.

In 7.8 and newer versions, the maximum internal length that can be handled has been increased, and even CHAR (8000) ASCII columns can be used as trigger parameters.

In 7.7, a check will be included during CREATE TRIGGER to avoid such trouble.

Elke

lbreddemann
Active Contributor

Hi there,

would you mind providing a reproducible example of that?

Please post the DDL for the tables and triggers and the DML that leads to the error.

regards,

Lars

Former Member

Of course.

Here it goes:

CREATE TABLE "TALD_ADMIN"."EXAMPLE"

(

"REC_ID" Fixed (15,0) NOT NULL DEFAULT SERIAL(1),

"EXA_DESC" Varchar (5000),

PRIMARY KEY ("REC_ID")

)

//

CREATE TABLE "TALD_ADMIN"."TRC_EXAMPLE"

(

"REC_ID" Fixed (15,0),

"EXA_DESC" Varchar (5000),

"CREATE_DATE" Date DEFAULT DATE,

"CREATE_TIME" Time DEFAULT TIME,

"OPERATION_ID" Varchar (5),

"USER_ID" Varchar (50) DEFAULT USER,

PRIMARY KEY ("REC_ID")

)

//

CREATE TRIGGER TRG_TRC_EXAMPLE_INS FOR TALD_ADMIN.EXAMPLE AFTER INSERT EXECUTE (

TRY

INSERT INTO TALD_ADMIN.TRC_EXAMPLE(OPERATION_ID,REC_ID,EXA_DESC)

VALUES ('I',:NEW.REC_ID,:NEW.EXA_DESC);

CATCH

IF $rc <> 100

THEN STOP ($rc, 'Trigger Error - AFTER INSERT OF EXAMPLE');

)

//

INSERT INTO EXAMPLE(EXA_DESC)VALUES('Test example table and trigger')

The last statement hangs my database (because the data space grows without bound).
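
For comparison, here is a minimal sketch (not part of the original repro, just an illustration of the workaround mentioned in the first post) with the column capped at 4000 characters; according to this thread, such a variant does not show the problem. The table name EXAMPLE_OK is made up for this sketch, and the log table and trigger would be defined analogously to the ones above, just with Varchar (4000):

CREATE TABLE "TALD_ADMIN"."EXAMPLE_OK"
(
    "REC_ID"   Fixed (15,0) NOT NULL DEFAULT SERIAL(1),
    "EXA_DESC" Varchar (4000),
    PRIMARY KEY ("REC_ID")
)
//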

Thank you for your interest,

Marcin

lbreddemann
Active Contributor

Hi Marcin,

I was able to reproduce this and informed our development.

Thanks for pointing it out.

regards,

Lars