
How to migrate from ascii to unicode (MaxDB 7.5)? loadercli: ERR -25347

Former Member
0 Kudos

Hi,

I use MaxDB 7.5.00.26. (Ok, I know that I should switch to 7.6; however, that is not possible for now due to a customer restriction, but should be possible quite soon.)

We'd like to migrate a DB from ASCII to Unicode. Based on the info in the thread "" I tried the following:

Export sourcedb

1. Export catalog and data

C:\> loadercli -d db_asc -u dba,dba

loadercli> export db catalog outstream file 'C:\tmp1\20080702a_dbAsc.catalog' ddl

OK

loadercli> export db data outstream file 'C:\tmp1\20080702b_dbAsc.data' pages

OK

loadercli> exit

Import targetdb

1. Create a new empty DB with '_UNICODE=yes'

2. Set 'columncompression' to 'no'

C:\> dbmcli -d db_uni -u dba,dba param_directput columncompression no

ERR -24979,ERR_XPNOTFOUND: parameter not found

I couldn't find this parameter in DBMGUI either (parameters: general, extended, and support).

3. Import catalog and data

C:\> loadercli -d db_uni -u dba,dba

loadercli> import db catalog instream file 'C:\tmp1\20080702a_dbAsc.catalog' ddl

OK

loadercli> import db data instream file 'C:\tmp1\20080702b_dbAsc.data' pages

ERR -25347 Encoding type of source and target database do not match: source = ASCII, target = UNICODE.

loadercli> exit

What is wrong? Does a migration from ASCII to Unicode have to be done some other way?

Can I migrate a DB from 7.5.00.26 to 7.6.03.15 in the same way, or should that be done differently?

It would be great if you could point me to a post etc. where these two migrations are explained in detail.

Thanks in advance - kind regards

Michael

Accepted Solutions (0)

Answers (1)


markus_doehr2
Active Contributor
0 Kudos

> 2. Set 'columncompression' to 'no'

>

> C:\> dbmcli -d db_uni -u dba,dba param_directput columncompression no

> ERR -24979,ERR_XPNOTFOUND: parameter not found

The parameter is "USEUNICODECOLUMNCOMPRESSION", not just "columncompression".
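A quick way to verify whether that parameter exists in a given installation is to query it directly with dbmcli before trying to set it (a sketch; the database name and credentials are the ones from the original post):

```
C:\> dbmcli -d db_uni -u dba,dba
dbmcli> param_directget USEUNICODECOLUMNCOMPRESSION
dbmcli> param_directput USEUNICODECOLUMNCOMPRESSION NO
```

If param_directget already returns ERR_XPNOTFOUND, the kernel of that version simply does not know the parameter.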

Markus

Former Member
0 Kudos

Hi,

I can find neither "USEUNICODECOLUMNCOMPRESSION" nor "COLUMNCOMPRESSION". Could it be that these only exist from MaxDB version 7.6 on and not in 7.5?

Kind regards,

Michael

The complete parameter list (created by "dbmcli -d db_uni -u dbm,dbm param_directgetall > maxdb_params.txt") is:

OK
KERNELVERSION                    	KERNEL    7.5.0    BUILD 026-123-094-430
INSTANCE_TYPE                    	OLTP
MCOD                             	NO
RESTART_SHUTDOWN                 	MANUAL
_SERVERDB_FOR_SAP                	YES
_UNICODE                         	YES
DEFAULT_CODE                     	ASCII
DATE_TIME_FORMAT                 	INTERNAL
CONTROLUSERID                    	DBM
CONTROLPASSWORD                  	
MAXLOGVOLUMES                    	2
MAXDATAVOLUMES                   	11
LOG_VOLUME_NAME_001              	LOG_001
LOG_VOLUME_TYPE_001              	F
LOG_VOLUME_SIZE_001              	131072
DATA_VOLUME_NAME_0001            	DAT_0001
DATA_VOLUME_TYPE_0001            	F
DATA_VOLUME_SIZE_0001            	262144
DATA_VOLUME_MODE_0001            	NORMAL
DATA_VOLUME_GROUPS               	1
LOG_BACKUP_TO_PIPE               	NO
MAXBACKUPDEVS                    	2
BACKUP_BLOCK_CNT                 	8
LOG_MIRRORED                     	NO
MAXVOLUMES                       	14
_MULT_IO_BLOCK_CNT               	4
_DELAY_LOGWRITER                 	0
LOG_IO_QUEUE                     	50
_RESTART_TIME                    	600
MAXCPU                           	1
MAXUSERTASKS                     	50
_TRANS_RGNS                      	8
_TAB_RGNS                        	8
_OMS_REGIONS                     	0
_OMS_RGNS                        	25
OMS_HEAP_LIMIT                   	0
OMS_HEAP_COUNT                   	1
OMS_HEAP_BLOCKSIZE               	10000
OMS_HEAP_THRESHOLD               	100
OMS_VERS_THRESHOLD               	2097152
HEAP_CHECK_LEVEL                 	0
_ROW_RGNS                        	8
_MIN_SERVER_DESC                 	16
MAXSERVERTASKS                   	21
_MAXTRANS                        	292
MAXLOCKS                         	2920
_LOCK_SUPPLY_BLOCK               	100
DEADLOCK_DETECTION               	4
SESSION_TIMEOUT                  	900
OMS_STREAM_TIMEOUT               	30
REQUEST_TIMEOUT                  	5000
_USE_ASYNC_IO                    	YES
_IOPROCS_PER_DEV                 	1
_IOPROCS_FOR_PRIO                	1
_USE_IOPROCS_ONLY                	NO
_IOPROCS_SWITCH                  	2
LRU_FOR_SCAN                     	NO
_PAGE_SIZE                       	8192
_PACKET_SIZE                     	36864
_MINREPLY_SIZE                   	4096
_MBLOCK_DATA_SIZE                	32768
_MBLOCK_QUAL_SIZE                	16384
_MBLOCK_STACK_SIZE               	16384
_MBLOCK_STRAT_SIZE               	8192
_WORKSTACK_SIZE                  	8192
_WORKDATA_SIZE                   	8192
_CAT_CACHE_MINSIZE               	262144
CAT_CACHE_SUPPLY                 	3264
INIT_ALLOCATORSIZE               	221184
ALLOW_MULTIPLE_SERVERTASK_UKTS   	NO
_TASKCLUSTER_01                  	tw;al;ut;2000*sv,100*bup;10*ev,10*gc;
_TASKCLUSTER_02                  	ti,100*dw;30000*us;
_TASKCLUSTER_03                  	compress
_MP_RGN_QUEUE                    	YES
_MP_RGN_DIRTY_READ               	NO
_MP_RGN_BUSY_WAIT                	NO
_MP_DISP_LOOPS                   	1
_MP_DISP_PRIO                    	NO
XP_MP_RGN_LOOP                   	0
MP_RGN_LOOP                      	0
_MP_RGN_PRIO                     	NO
MAXRGN_REQUEST                   	300
_PRIO_BASE_U2U                   	100
_PRIO_BASE_IOC                   	80
_PRIO_BASE_RAV                   	80
_PRIO_BASE_REX                   	40
_PRIO_BASE_COM                   	10
_PRIO_FACTOR                     	80
_DELAY_COMMIT                    	NO
_SVP_1_CONV_FLUSH                	NO
_MAXGARBAGE_COLL                 	0
_MAXTASK_STACK                   	1024
MAX_SERVERTASK_STACK             	100
MAX_SPECIALTASK_STACK            	100
_DW_IO_AREA_SIZE                 	50
_DW_IO_AREA_FLUSH                	50
FBM_VOLUME_COMPRESSION           	50
FBM_VOLUME_BALANCE               	10
_FBM_LOW_IO_RATE                 	10
CACHE_SIZE                       	10000
_DW_LRU_TAIL_FLUSH               	25
XP_DATA_CACHE_RGNS               	0
_DATA_CACHE_RGNS                 	8
XP_CONVERTER_REGIONS             	0
CONVERTER_REGIONS                	8
XP_MAXPAGER                      	0
MAXPAGER                         	11
SEQUENCE_CACHE                   	1
_IDXFILE_LIST_SIZE               	2048
_SERVER_DESC_CACHE               	74
_SERVER_CMD_CACHE                	22
VOLUMENO_BIT_COUNT               	8
OPTIM_MAX_MERGE                  	500
OPTIM_INV_ONLY                   	YES
OPTIM_CACHE                      	NO
OPTIM_JOIN_FETCH                 	0
JOIN_SEARCH_LEVEL                	0
JOIN_MAXTAB_LEVEL4               	16
JOIN_MAXTAB_LEVEL9               	5
_READAHEAD_BLOBS                 	25
RUNDIRECTORY                     	E:\_mp\u_v_dbs\EVERW_T3
_KERNELDIAGFILE                  	knldiag
KERNELDIAGSIZE                   	800
_EVENTFILE                       	knldiag.evt
_EVENTSIZE                       	0
_MAXEVENTTASKS                   	1
_MAXEVENTS                       	100
_KERNELTRACEFILE                 	knltrace
TRACE_PAGES_TI                   	2
TRACE_PAGES_GC                   	0
TRACE_PAGES_LW                   	5
TRACE_PAGES_PG                   	3
TRACE_PAGES_US                   	10
TRACE_PAGES_UT                   	5
TRACE_PAGES_SV                   	5
TRACE_PAGES_EV                   	2
TRACE_PAGES_BUP                  	0
KERNELTRACESIZE                  	653
EXTERNAL_DUMP_REQUEST            	NO
_AK_DUMP_ALLOWED                 	YES
_KERNELDUMPFILE                  	knldump
_RTEDUMPFILE                     	rtedump
_UTILITY_PROTFILE                	dbm.utl
UTILITY_PROTSIZE                 	100
_BACKUP_HISTFILE                 	dbm.knl
_BACKUP_MED_DEF                  	dbm.mdf
_MAX_MESSAGE_FILES               	0
_EVENT_ALIVE_CYCLE               	0
_SHAREDDYNDATA                   	10280
_SHAREDDYNPOOL                   	3658
USE_MEM_ENHANCE                  	NO
MEM_ENHANCE_LIMIT                	0
__PARAM_CHANGED___               	0
__PARAM_VERIFIED__               	2008-07-02 21:10:19
DIAG_HISTORY_NUM                 	2
DIAG_HISTORY_PATH                	E:\_mp\u_v_dbs\EVERW_T3\DIAGHISTORY
_DIAG_SEM                        	1
SHOW_MAX_STACK_USE               	NO
LOG_SEGMENT_SIZE                 	43690
SUPPRESS_CORE                    	YES
FORMATTING_MODE                  	PARALLEL
FORMAT_DATAVOLUME                	YES
HIRES_TIMER_TYPE                 	CPU
LOAD_BALANCING_CHK               	0
LOAD_BALANCING_DIF               	10
LOAD_BALANCING_EQ                	5
HS_STORAGE_DLL                   	libhsscopy
HS_SYNC_INTERVAL                 	50
USE_OPEN_DIRECT                  	NO
SYMBOL_DEMANGLING                	NO
EXPAND_COM_TRACE                 	NO
OPTIMIZE_OPERATOR_JOIN_COSTFUNC  	YES
OPTIMIZE_JOIN_PARALLEL_SERVERS   	0
OPTIMIZE_JOIN_OPERATOR_SORT      	YES
OPTIMIZE_JOIN_OUTER              	YES
JOIN_OPERATOR_IMPLEMENTATION     	YES
JOIN_TABLEBUFFER                 	128
OPTIMIZE_FETCH_REVERSE           	YES
SET_VOLUME_LOCK                  	YES
SHAREDSQL                        	NO
SHAREDSQL_EXPECTEDSTATEMENTCOUNT 	1500
SHAREDSQL_COMMANDCACHESIZE       	32768
MEMORY_ALLOCATION_LIMIT          	0
USE_SYSTEM_PAGE_CACHE            	YES
USE_COROUTINES                   	YES
MIN_RETENTION_TIME               	60
MAX_RETENTION_TIME               	480
MAX_SINGLE_HASHTABLE_SIZE        	512
MAX_HASHTABLE_MEMORY             	5120
HASHED_RESULTSET                 	NO
HASHED_RESULTSET_CACHESIZE       	262144
AUTO_RECREATE_BAD_INDEXES        	NO
LOCAL_REDO_LOG_BUFFER_SIZE       	0
FORBID_LOAD_BALANCING            	NO

former_member229109
Active Contributor
0 Kudos

Hello Michael,

-> If you are an SAP customer, please review the SAP notes:

852597 Error -2000 "Row too long" during CREATE or ALTER TABLE

962019 Heterogeneous system copy of a MaxDB Content Server Storage

-> You wrote that you have database version 7.5.00.26; the parameter COLUMNCOMPRESSION does not exist in your database version.

The parameter 'COLUMNCOMPRESSION', with which all columns (except for key/long columns) are stored in variable length (PTS 1132544), was added as of database version 7.6.00.10.

-> A new kernel parameter 'USEUNICODECOLUMNCOMPRESSION' was added as of version 7.6.03.04. PTS 1147552, www.sapdb.org/webpts

-> Please let us know if you are an SAP customer and could follow the SAP notes:

962019 Heterogeneous system copy of a MaxDB Content Server Storage

1014782 FAQ: MaxDB system copy

Thank you and best regards, Natalia Khlopina

Edited by: Natalia Khlopina on Jul 2, 2008 7:23 PM

Former Member
0 Kudos

Hi Natalia,

thanks for the info about the two DB parameters.

I am not a SAP customer. I use MaxDB with "Community license (free of charge)" (see http://maxdb.sap.com/license/MaxDB_Community_License_2007.pdf).

I have the following question: Is it right that the community license only covers MaxDB versions 7.5 and 7.6, but not newer ones? Is "7.6.03.15" also covered by the community license? I am confused because on the one hand the "[SAP network wiki|https://www.sdn.sap.com/irj/sdn/wiki?path=/display/maxdb/faq]" explicitly mentions "7.6.00" as the last "open source product" (see quote below), while on the other hand "SAP MaxDB Downloads - Community Editions" only offers "7.6.03.15" and mentions the above-mentioned community license.

Quote from "[SAP network wiki|https://www.sdn.sap.com/irj/sdn/wiki?path=/display/maxdb/faq]":

What is the relationship of MaxDB and SAPDB and open source? MaxDB versions 7.5 and 7.6.00 are available as an open source product for several years now. These continue to stay in open source. This is maintained and driven by an open source community. We are proud of past SAP contributions to open source and the continued contributions of the community to maintain it. But SAP is not making any more active contributions to it. Over the last two years, we have continued to evolve the SAP MaxDB technology and have made very significant investments and innovations in this technology. These innovations are not in open source. Thus the latest MaxDB product is not open source.

Is there any chance to get a login for the "SAP service marketplace" as a non-SAP customer?

To migrate a DB from MaxDB 7.5.00.26 ASCII to MaxDB 7.6.03.15 Unicode, would you advise

- to do it directly (7.5 ascii >> 7.6 unicode) or

- to go 7.5 ascii >> 7.5 unicode >> 7.6 unicode or

- to go 7.5 ascii >> 7.6 ascii >> 7.6 unicode?

How?

Thanks in advance -- Kind regards

Michael

Edited by: Michael Poetzsch on Jul 3, 2008 11:45 AM

lbreddemann
Active Contributor
0 Kudos

Hi Michael,

I'll try to answer your questions one by one, as the thread is becoming confusing.

Question 1

Your primary question was "How can we convert a MaxDB database from ASCII to UNICODE?".

As you've surely realized by now, the data encoding scheme of your database had nothing to do with the problem in your other thread.

Therefore the question is: why would you want to convert your database to UNICODE? Do you need to store UNICODE data in it?

The parameter _UNICODE does not impact the storage of your application data but only defines what encoding scheme is used for the database catalog information.

For your application data the encoding scheme is defined with the creation of columns. You can specify ASCII or UNICODE encoding for character types for each column separately.

If you omit this (as is often done), then the DB parameter DEFAULT_CODE defines which encoding is used.
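For illustration, this is what per-column encoding looks like in MaxDB SQL (a sketch only; table and column names are made up):

```sql
-- The encoding can be fixed per character column.
-- Columns without an explicit ASCII/UNICODE keyword fall back
-- to the instance parameter DEFAULT_CODE.
CREATE TABLE demo_tab
(
    id    INTEGER PRIMARY KEY,
    name  CHAR(40) UNICODE,   -- stored as Unicode regardless of DEFAULT_CODE
    code  CHAR(10) ASCII      -- stored as ASCII regardless of DEFAULT_CODE
)
```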

So if you want to 'convert' the catalog only, you can export your application data, create a db instance with _UNICODE = YES and import your data there.

Alternatively you can upgrade your 7.5 instance to 7.6 - as far as I remember, the catalog will then be rebuilt in UNICODE (you may want to check this yourself, as I did not have a 7.5 instance with _UNICODE = NO at hand).

To convert your application data to unicode you will have to change the column definitions of your tables.

For the conversion ASCII -> UNICODE this can be done via ALTER TABLE..., but if it affects several columns/tables it may be better to unload the data (e.g. as CSV), recreate all tables with UNICODE columns, and import the data again.
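The per-column conversion mentioned above could look roughly like this (a sketch with made-up names; please verify the exact ALTER TABLE syntax against the MaxDB SQL reference for your version):

```sql
-- convert a single character column to Unicode storage
ALTER TABLE demo_tab MODIFY (name CHAR(40) UNICODE)
```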

Be aware that also your client application needs to be able to handle the UNICODE data.

-


Question 2

"Is it right that the community license only covers the MaxDB versions 7.5 and 7.6 but not newer ones? Is 7.6.03.15 also covered by the community license?"

MaxDB 7.6.03.15 belongs to the same major release as 7.6 - so yes, it is covered by the community license. Simply put: whenever you can download a MaxDB version from the SDN download sections, it's covered by the community license.

-


Question 3

Is there any chance to get a login for the "SAP service marketplace" as a non-SAP customer?

No, this is not possible. The Service Marketplace is the place for SAP customers to access licensed software. Software they pay a lot of money for. Just like the tools for system copies.

Apart from the fact that it would be quite difficult to explain to the paying customers why others get the software too without paying anything for it, it would barely help you.

The SAP software for system copies is created to perform copies/conversions of SAP systems only. It relies heavily on the internal structures of SAP NetWeaver databases.

This software will basically not work without a SAP NetWeaver database.

-


Question 4

"What is the best way to perform the conversion from ASCII to UNICODE?"

Well, you don't need to upgrade the database instance (although I would recommend this).

How you can actually perform the conversion - see my reply to Question 1.

If I were in your place I would perform the upgrade to 7.6, then double-check whether I really need UNICODE for my application data. Only if that is really the case would I take the effort to convert all the data.

-


Ok, I hope that answered your questions.

Best regards,

Lars

Former Member
0 Kudos

Hi Lars,

thanks for all your infos.

We need to migrate to Unicode because we need to cope with characters that are not part of an ANSI codepage. It's a huge effort for just a few characters :-).

I still have the following problems to solve.

Kind regards,

Michael

As for MaxDB 7.5.00.26, I still don't manage to import data from an ASCII DB. To test it, I created 3 DBs with DB Manager 7.5 (GUI) with the default values given in the wizard, unless otherwise mentioned:

- TEST_A01 - an ASCII DB as "source"; added a table under DBA with an id and a char field and 2 records

- TEST_A02 - an ASCII DB as "destination"

- TEST_U01 - a unicode DB as "destination" (_UNICODE=YES)

1. First, I exported catalog and data from TEST_A01 (see below) with loadercli (7.5)

2. Then, I imported catalog and data into

(a) the empty ascii DB (TEST_A02) and

(b) the empty unicode DB (TEST_U01).

The export works fine, and the import works fine for the ASCII DB. However, importing into the Unicode DB leads to an error (see below). Is it not possible to import ASCII data into a Unicode DB with MaxDB version 7.5.00.26 and loadercli 7.5.00.26? Or do I have to use other loadercli commands/options? Would the loader that comes with DB Studio 7.7 be able to do it?

1. TEST_A01 (Export):

> loadercli -d TEST_A01 -u dba,dba

loadercli> export db catalog outstream file 'E:\_mp\u_v_dbs\x_loaderData\20080704-01_testA01.catalog'

OK

loadercli> export db data outstream file 'E:\_mp\u_v_dbs\x_loaderData\20080704-01_testA01.data' pages

OK

loadercli> exit

2. (a) TEST_A02 (import into ASCII DB):

> loadercli -d TEST_A02 -u dba,dba

loadercli> import db catalog instream file 'E:\_mp\u_v_dbs\x_loaderData\20080704-01_testA01.catalog'

OK

loadercli> import db data instream file 'E:\_mp\u_v_dbs\x_loaderData\20080704-01_testA01.data' pages

OK

loadercli> exit

2. (b) TEST_U01 (import into Unicode DB):

> loadercli -d TEST_U01 -u dba,dba

loadercli> import db catalog instream file 'E:\_mp\u_v_dbs\x_loaderData\20080704-01_testA01.catalog'

OK

loadercli> import db data instream file 'E:\_mp\u_v_dbs\x_loaderData\20080704-01_testA01.data' pages

ERR -25347 Encoding type of source and target database do not match: source = ASCII, target = UNICODE.

loadercli> exit

Besides this I tried the following combination:

As for MaxDB 7.6.03.15 plus loader included in DB Studio 7.7, I found out the following:

I can import a self-created schema of an ASCII DB not only into another ASCII DB but also into a Unicode DB!

However, another problem seems to occur: the data of the tables created in the DBA schema can be imported neither into the DBA schema of another DB nor into another self-created schema of another DB. (Details and examples below under "Export+Import: DB1-ascii/DBA-schema ...")

It is important for me to handle self-created tables that are part of the DBA schema, because MaxDB 7.5 does not support schemas ("create schema myschema" cannot be called in SqlStudio 7.5.00.18), and thus the DB I have to convert to Unicode (and hopefully also to 7.6) has all its tables in the DBA schema.

How can I transfer the data of the self-created tables located in the DBA schema into another DB?

Here are the details and examples: "TESTSCHEMA" is a schema I created in addition to the predefined schema "DBA". TEST_A01 and TEST_A02 are ASCII DB instances and TEST_U01 is a Unicode DB instance of MaxDB version 7.6.03.15; TEST_A01 contains an example table in schema DBA and another one in TESTSCHEMA.

- "Export+Import": DB1-ascii/TESTSCHEMA >> DB2-ascii/TESTSCHEMA: works fine

DB "TEST_A01": Export of scheme

Loader job 1> Started export of schema TESTSCHEMA to E:\_mp\u_v_dbs\Test_A01_export_testschema

Loader job 1> SET TRANSACTION SIZE 100000

Loader job 1> EXPORT SCHEMA "TESTSCHEMA" CATALOG OUTSTREAM FILE 'E:\_mp\u_v_dbs\Test_A01_export_testschema\TESTSCHEMA.CATALOG' DATA OUTSTREAM FILE 'E:\_mp\u_v_dbs\Test_A01_export_testschema\TESTSCHEMA.DATA' RECORDS PACKAGE OUTSTREAM FILE 'E:\_mp\u_v_dbs\Test_A01_export_testschema\TESTSCHEMA.EXPORT'

Loader job 1> Total number of tables (definition) exported: 1; Total number of tables (data) exported: 1 (excluded: 0, failed: 0)

Loader job 1> Finished successfully

DB "TEST_A02": Import of scheme

Loader job 2> Started import of schema TESTSCHEMA from E:\_mp\u_v_dbs\Test_A01_export_testschema

Loader job 2> CREATE SCHEMA "TESTSCHEMA"

Loader job 2> SET TRANSACTION SIZE 10000

Loader job 2> IMPORT SCHEMA "TESTSCHEMA" REJECT DUPLICATES CATALOG INSTREAM FILE 'E:\_mp\u_v_dbs\Test_A01_export_testschema\TESTSCHEMA.CATALOG' DATA INSTREAM FILE 'E:\_mp\u_v_dbs\Test_A01_export_testschema\TESTSCHEMA.DATA' RECORDS

Loader job 2> Total number of tables (definition) imported: 1; Total number of tables (data) imported: 1 (excluded: 0, failed: 0)

Loader job 2> Finished successfully

- Export+Import: DB1-ascii/TESTSCHEMA >> DB2-unicode/TESTSCHEMA: works fine as well !!!

- Export+Import: DB1-ascii/DBA-scheme >> (a) DB2-ascii/DBA-scheme or (b) DB2-ascii/TESTSCHEMA2: did not work

DB "TEST_A01": Export of DBA scheme

Loader job 3> Started export of schema DBA to E:\_mp\u_v_dbs\Test_A01_export_dba

Loader job 3> SET TRANSACTION SIZE 100000

Loader job 3> EXPORT SCHEMA "DBA" CATALOG OUTSTREAM FILE 'E:\_mp\u_v_dbs\Test_A01_export_dba\DBA.CATALOG' DATA OUTSTREAM FILE 'E:\_mp\u_v_dbs\Test_A01_export_dba\DBA.DATA' RECORDS PACKAGE OUTSTREAM FILE 'E:\_mp\u_v_dbs\Test_A01_export_dba\DBA.EXPORT'

Loader job 3> Total number of tables (definition) exported: 2; Total number of tables (data) exported: 2 (excluded: 0, failed: 0)

Loader job 3> Finished successfully

... >> (a) DB2-ascii/DBA-Schema: Import of DBA scheme

DB "TEST_A02"

Loader job 4> Started import of schema DBA from E:\_mp\u_v_dbs\Test_A01_export_dba

Loader job 4> SET TRANSACTION SIZE 10000

Loader job 4> IMPORT SCHEMA "DBA" REJECT DUPLICATES CATALOG INSTREAM FILE 'E:\_mp\u_v_dbs\Test_A01_export_dba\DBA.CATALOG' DATA INSTREAM FILE 'E:\_mp\u_v_dbs\Test_A01_export_dba\DBA.DATA' RECORDS

failed with error code -25392; see DB Studio Event Log for details

Loader job 4> Finished with errors

... >> (b) DB2-ascii/TESTSCHEMA2: Import of DBA scheme

DB "TEST_A02"

Loader job 5> Started import of schema TESTSCHEMA2 from E:\_mp\u_v_dbs\Test_A02_export_dba

Loader job 5> SET TRANSACTION SIZE 10000

Loader job 5> IMPORT SCHEMA "TESTSCHEMA2" REJECT DUPLICATES MAP SCHEMA "DBA" TO "TESTSCHEMA2" CATALOG INSTREAM FILE 'E:\_mp\u_v_dbs\Test_A01_export_dba\DBA.CATALOG' DATA INSTREAM FILE 'E:\_mp\u_v_dbs\Test_A01_export_dba\DBA.DATA' RECORDS

failed with error code -25392; see DB Studio Event Log for details

Loader job 5> Finished with errors

Edited by: Michael Poetzsch on Jul 4, 2008 8:42 PM

lbreddemann
Active Contributor
0 Kudos

> We need to migrate to unicode because we need to cope with chars which are not part of an ansi font. It's a huge effort for just some chars :-).

Yes, it is

> As for MaxDB 7.5.00.26, I still don't manage to import data from an ascii db. To test it, I created 3 DBs with DB Manager 7.5 (GUI) with the default values given in the wizard, otherwise mentioned

> - TEST_A01 - a ascii DB as "source", added a table under DBA with a id and a char field and 2 datasets

> - TEST_A02 - a ascii DB as "destination"

> - TEST_U01 - a unicode DB as "destination" (_UNICODE=YES)

> 1. First, I exported catalog and data from TEST_A01 (see below) with loadercli (7.5)

> 2. Then, I imported catalog and data into

> (a) the empty ascii DB (TEST_A02) and

> (b) the empty unicode DB (TEST_U01).

>

> Export works fine, Import works fine for the ascii db. However, when importing into the unicode db leads an error occurs (see below). Is it not possible to import ascii data into a unicode db with MaxDB version 7.5.00.26 and loadercli 7.5.00.26? Or do I have to use other loadercli commands/options? Would the loader which comes together with DB Studio 7.7 be able to do it?

I propose you just use the loadercli from 7.6.03. Build 15 (available here in SDN) for your loading activities.

> 1. TEST_A01 (Export):

>

> loadercli -d TEST_A01 -u dba,dba

> loadercli> export db catalog outstream file 'E:\_mp\u_v_dbs\x_loaderData\20080704-01_testA01.catalog'

> OK

> loadercli> export db data outstream file 'E:\_mp\u_v_dbs\x_loaderData\20080704-01_testA01.data' pages

> OK

> loadercli> exit

Hmm... you want to convert data here... so better not use pages as the intermediate format.

Use (as written above) CSV data as this will be handled via INSERT statements.
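Following that advice, the failing export/import could be retried with record-based streams instead of pages, so the loader goes through INSERT statements and can convert the encoding (a sketch; the file names are made up, and the exact options should be checked against the Loader documentation for your version):

```
loadercli> export db data outstream file 'E:\tmp\testA01.data' records
loadercli> import db data instream file 'E:\tmp\testA01.data' records
```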

> It is important for me to treat self-created tables which are part of DBA scheme, because MaxDB 7.5 does not support schemes ("create scheme myscheme" cannot be called in SqlStudio 7.5.00.18) and thus the db I have to convert to unicode (and hopefully also to 7.6) has all its tables in the DBA scheme.

Well, 7.5 does not support the explicit handling of schemas. Anyhow - create a user, create the tables as that user, and you have your schema (just like Oracle does it).

You should never create your tables in the dba schema...
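That workaround could look roughly like this on 7.5 (a sketch with made-up names; check the exact CREATE USER options in the MaxDB SQL reference):

```sql
-- a dedicated user acts as the 'schema' in 7.5
CREATE USER appowner PASSWORD secret RESOURCE
-- then connect as appowner and create the application tables there
```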

> How can I transfer the data of the self-created tables located in the DBA scheme into another db?

Hmm... I would try to export the whole db and edit the DDL script.

Remove everything that does not belong to your application data.

Afterwards the import should work.
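That workflow could be sketched like this (file names are made up):

```
loadercli> export db catalog outstream file 'C:\tmp\db.catalog' ddl
loadercli> export db data outstream file 'C:\tmp\db.data' records

(edit C:\tmp\db.catalog by hand: remove everything that is not application DDL)

loadercli> import db catalog instream file 'C:\tmp\db.catalog' ddl
loadercli> import db data instream file 'C:\tmp\db.data' records
```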

Best regards,

Lars

Former Member
0 Kudos

Hi Lars,

thanks a lot for all your hints.

For now, I've stopped the investigation, because the migration has been put on hold and will come up again in a few months. The deadlines sometimes shift quite abruptly.

Kind regards,

Michael