
Data Migration_LSMW

Former Member

Hi all,

I need information on data migration and the possible methods of LSMW.

Thanks,

Swapna

Accepted Solutions (1)

Former Member

Hi,

You can do a search on the topics "Data Migration" and "LSMW" in the forum for valuable information.

Please do not post your mail ID; let's all follow the forum rules.

Data Migration Life Cycle

Overview: Data Migration Life Cycle

Data Migration

This document aims to outline the typical processes involved in a data migration.

Data migration is the moving and transforming of data from legacy to target database systems. This includes one to one and one to many mapping and movement of static and transactional data. Migration also relates to the physical extraction and transmission of data between legacy and target hardware platforms.

ISO 9001 / TickIT accredited

The fundamental aims of certification are the achievement and improvement of quality and the delivery of customer satisfaction.

The ISO and TickIT Standards are adhered to throughout all stages of the migration process.

• Customer Requirements

• Dependencies

• Analysis

• Iterations

• Data Cleanse

• Post Implementation

• Proposal

• Project Management

• Development

• Quality Assurance

• Implementation

Customer Requirements

The first stage is the contact from the customer asking us to tender for a data migration project. The invitation to tender will typically include the Scope / Requirements and Business Rules:

• Legacy and Target - Databases / Hardware / Software

• Timeframes - Start and Finish

• Milestones

• Location

• Data Volumes

Dependencies

Environmental Dependencies

• Connectivity - remote or on-site

• Development and Testing Infrastructure - hardware, software, databases, applications and desktop configuration

Support Dependencies

• Training (legacy & target applications) - particularly for an in-house test team

• Business Analysts - provide expert knowledge on both legacy and target systems

• Operations - Hardware / Software / Database Analysts - facilitate system housekeeping when necessary

• Business Contacts

• User Acceptance Testers - chosen by the business

• Business Support for data cleanse

Data Dependencies

• Translation Tables - translate legacy parameters to target parameters

• Static Data / Parameters / Seed Data (target parameters)

• Business Rules - migration selection criteria (e.g. number of months of history)

• Entity Relationship Diagrams / Transfer Dataset / Schemas (legacy & target)

• Sign Off / User Acceptance criteria - within agreed tolerance limits

• Data Dictionary

Analysis

Gap Analysis

Gap analysis identifies where differences in functionality between the target and legacy systems mean that data may be left behind, or alternatively where default data must be generated for the new system because nothing comparable exists on legacy.

Liaison with the business is vital in this phase, as mission critical data cannot be allowed to be left behind; it is usual to consult with the relevant business process leader or Subject Matter Expert (SME). Often this process ends up as a compromise between:

• Pulling the necessary data out of the legacy system to meet the new system's functionality

• Pushing certain data into the new system from legacy to enable certain ad hoc or custom in-house processes to continue

Data mapping

This is the process of mapping data from the legacy to the target database schemas, taking into account any reformatting needed. This would normally include the derivation of the translation tables used to transform parametric data. It may be the case at this point that the seed data, or static data, for the new system needs generating, and here again tight integration and consultation with the business is a must.

Translation Tables

Mapping Legacy Parameters to Target Parameters
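As an illustration of what such a table amounts to, the ABAP sketch below hard-codes a few hypothetical 'From'/'To' pairs. The field sizes and values are invented for the example; in practice the pairs would live in a maintained customising table:

REPORT ztranslation_demo.

* Minimal sketch of a parameter translation table (hypothetical values).
TYPES: BEGIN OF ty_trans,
         legacy TYPE c LENGTH 10,   " value as found on the legacy system
         target TYPE c LENGTH 10,   " corresponding target parameter
       END OF ty_trans.

DATA: lt_trans TYPE STANDARD TABLE OF ty_trans,
      ls_trans TYPE ty_trans.

* 'From' -> 'To' pairs, e.g. free-format titles to target parameters.
ls_trans-legacy = 'MRS'. ls_trans-target = 'Mrs'. APPEND ls_trans TO lt_trans.
ls_trans-legacy = 'mrs'. ls_trans-target = 'Mrs'. APPEND ls_trans TO lt_trans.
ls_trans-legacy = 'MR'.  ls_trans-target = 'Mr'.  APPEND ls_trans TO lt_trans.

* Translate one legacy value.
READ TABLE lt_trans INTO ls_trans WITH KEY legacy = 'mrs'.
IF sy-subrc = 0.
  WRITE: / 'Target value:', ls_trans-target.
ENDIF.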

Specifications

These designs are produced to enable the developer to create the Extract, Transform and Load (ETL) modules. The output from the gap analysis and data mapping is used to drive the design process. Any constraints imposed by platforms, operating systems, programming languages, timescales, etc. should be referenced at this stage, as should any dependencies that this module will have on other such modules in the migration as a whole; failure to do this may result in the specifications being flawed.

There are generally two forms of migration specification: Functional (e.g. Premise migration strategy) and Detailed Design (e.g. Premise data mapping document).

Built into the migration process at the specification level are steps to reconcile the migrated data at predetermined points during the migration. These checks verify that no data has been lost or gained during each step of an iteration and enable any anomalies to be spotted early and their cause ascertained with minimal loss of time.

Usually written independently from the migration, the specifications for the reconciliation programs used to validate the end-to-end migration process are designed once the target data has been mapped and is more or less static. These routines count like-for-like entities on the legacy system and target system and ensure that the correct volumes of data from legacy have migrated successfully to the target and thus build business confidence.

Iterations

Iterations are runs of the migration process, which may or may not include new cuts of legacy data.

These facilitate:

• Collation of migration process timings (extraction, transmission, transformation and load).

• The refinement of the migration code, i.e. increasing data volume and decreasing exceptions, through:

• Continual identification of data cleanse issues

• Confirmation of parameter settings and parameter translations

• Identification of any migration merge issues

• Reconciliation

From our experience the majority of the data will conform to the migration rules and as such take a minimal effort to migrate ("80/20 rule"). The remaining data, however, is often highly complex with many anomalies and deviations and so will take up the majority of the development time.

Data Cuts

• Extracts of data taken from the legacy and target systems. This can be a complex task where the migration is from multiple legacy systems, and it is important that the data is synchronised across all systems at the time the cuts are taken (e.g. end of day processes complete).

• Subsets / selective cuts - depending upon business rules and migration strategy, the extracted data may need to be split before transfer.

Freeze

Prior to any iteration, Parameters, Translation Tables and Code should be frozen to provide a stable platform for the iteration.

Data Cleanse

This activity is required to ensure that legacy system data conforms to the rules of data migration. The activities include manual or automatic updates to legacy data. This is an ongoing activity, as while the legacy systems are active there is the potential to reintroduce data cleanse issues.

Identified by

• Data Mapping

• Eyeballing

• Reconciliation

• File Integrities

Common Areas

• Address Formats

• Titles (e.g. mrs, Mrs, MRS, first name) - see the cleanse sketch below

• Invalid characters

• Duplicate Data

• Free Format to parameter field
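A simple automated cleanse for one of the common areas above, title case variants, might be sketched in ABAP as below. ZLEG_CUST and its TITLE field are invented staging names, not standard objects; a real cleanse would follow the agreed cleansing strategy:

REPORT zcleanse_titles.

* Hypothetical pre-migration cleanse against a legacy staging copy:
* normalise case variants of the title to the single target parameter.
UPDATE zleg_cust SET title = 'Mrs'
  WHERE title = 'MRS' OR title = 'mrs'.

WRITE: / sy-dbcnt, 'title values normalised'.
COMMIT WORK.

* A similar pass would follow for each cleanse rule identified by data
* mapping, eyeballing, reconciliation or file integrity checks.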

Cleansing Strategy

• Legacy - Pre Migration

• During Migration (not advised, as this makes reconciliation very difficult)

• Target - Post Migration (either manual or via data fix)

• Ad Hoc Reporting - Ongoing

Post Implementation

Support

For an agreed period after implementation certain key members of the migration team will be available to the business to support them in the first stages of using the new system. Typically this will involve analysis of any irregularities that may have arisen through dirty data or otherwise and where necessary writing data fixes for them.

Post Implementation fixes

Post Implementation Data Fixes are programs that are executed post migration to fix data that was either migrated in an 'unclean' state or migrated with known errors. These will typically take the form of SQL scripts.
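As noted, these fixes typically take the form of SQL scripts; an equivalent sketch as a one-off ABAP correction report is shown below. The table ZTGT_ORDERS and the status values are invented purely for illustration:

REPORT zpost_impl_fix_001.

* Hypothetical post-implementation fix: correct records that were
* migrated with a known error.
UPDATE ztgt_orders SET status = 'COMPLETE'
  WHERE status = 'COMPLETE_ERR'.

IF sy-subrc = 0.
  COMMIT WORK.
  WRITE: / sy-dbcnt, 'records corrected'.
ELSE.
  ROLLBACK WORK.
  WRITE: / 'No matching records - fix not applied'.
ENDIF.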

Proposal

This is a response to the invitation to tender, which comprises the following:

Migration Strategy

• Migration development models are based on an iterative approach.

• Multiple Legacy / Targets - any migration may transform data from one or more legacy databases to one or more targets.

• Scope - Redwood definition / understanding of customer requirements, inclusions and exclusions.

The data may be migrated in several ways, depending on data volumes and timescales:

• All at once (big bang)

• In logical blocks (chunking, e.g. by franchise)

• Pilot - a pre-test or trial run for the purpose of proving the migration process, live applications and business processes before implementing on a larger scale.

• Catch Up - to minimise downtime, only business critical data is migrated, leaving historical data to be migrated at a later stage.

• Post Migration / Parallel Runs - both pre and post migration systems remain active and are compared after a period of time to ensure the new systems are working as expected.

Milestones can include:

• Completion of specifications / mappings

• Successful 1st iteration

• Completion of an agreed number of iterations

• Delivery to User Acceptance Testing team

• Successful Dress Rehearsal

• Go Live

Roles and Responsibilities

Data Migration Project Manager/Team Lead is responsible for:

• Redwood Systems Limited project management

• Change Control

• Solution Design

• Quality

• Reporting

• Issues Management

Data Migration Analyst is responsible for:

• Gap Analysis

• Data Analysis & Mapping

• Data migration program specifications

• Extraction software design

• Exception reporting software design

Data Migration Developers are responsible for:

• Migration

• Integrity

• Reconciliation (note these are independently developed)

• Migration Execution and Control

Testers/Quality Assurance team is responsible for:

• Test approach

• Test scripts

• Test cases

• Integrity software design

• Reconciliation software design

Other Roles:

• Operational and Database Administration support for source/target systems.

• Parameter Definition and Parameter Translation team

• Legacy system Business Analysts

• Target system Business Analysts

• Data Cleansing Team

• Testing Team

Project Management

Project Plan

• Milestones and Timescales

• Resources

• Individual Roles and Responsibilities

• Contingency

Communication

It is important to have good communication channels with the project manager and business analysts. Important considerations include the need to agree the location, method and format of regular meetings/contact to discuss progress and resources, and to communicate any problems or incidents which may impact the ability of others to perform their duties. These could take the form of weekly conference calls, progress reports or attendance at on-site project meetings.

Change Control

• Scope Change Requests - a stringent change control mechanism needs to be in place to handle any deviations and creeping scope from the original project requirements.

• Version Control - all documents and code shall be version controlled.

Issue Management

• Internal issue management - arising from gap analysis, data mapping and iteration output (i.e. reconciliation and file integrity, or eyeballing)

• External issue management - load-to-target problems and issues arising from User Acceptance Testing

• Mechanism - examples:

• Test Director

• Bugzilla

• Excel

• Access

• TracNotes

Development

Extracts / Loads

• Depending on the migration strategy, extract routines shall be written to derive the legacy data required (a minimal extract sketch follows this list)

• Transfer data from Legacy and/or Target to the interim migration environment via FTP, Tape, CSV, D/B object copy, ODBC or API

• Transfer data from the interim migration environment to the target
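A minimal extract sketch in ABAP, assuming standard customer data (table KNA1) is written to a delimited file on the application server for onward transmission; the file path and field selection are illustrative only:

REPORT zextract_customers.

* Write selected legacy rows to a delimited file for transfer (FTP,
* tape, etc.) to the interim migration environment.
DATA: lv_file  TYPE string VALUE '/tmp/legacy_customers.csv',
      lv_line  TYPE string,
      lv_kunnr TYPE kna1-kunnr,
      lv_name1 TYPE kna1-name1,
      lv_ort01 TYPE kna1-ort01.

OPEN DATASET lv_file FOR OUTPUT IN TEXT MODE ENCODING DEFAULT.

SELECT kunnr name1 ort01 FROM kna1
       INTO (lv_kunnr, lv_name1, lv_ort01).
  CONCATENATE lv_kunnr lv_name1 lv_ort01 INTO lv_line SEPARATED BY ','.
  TRANSFER lv_line TO lv_file.
ENDSELECT.

CLOSE DATASET lv_file.
WRITE: / sy-dbcnt, 'customers extracted'.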

Migration (transform)

There are a number of potential approaches to a Data Migration:

• Use a middleware tool (e.g. ETI, Powermart). This extracts data from the legacy system, manipulates it and pushes it to the target system. These "4th Generation" approaches are less flexible and often less efficient than bespoke coding, resulting in longer migrations and less control over the data migrated.

• The Data Migration processes are individually coded to be run on a source, interim or target platform. The data is extracted from the legacy platform to the interim / target platform, where the code is used to manipulate the legacy data into the target system format. The great advantage of this approach is that it can encompass any migration manipulation that may be required in the most efficient, effective way and retain the utmost control. Where critical / sensitive data is migrated, this approach is desirable.

• Use a target system 'File Load Utility', if one exists. This usually requires the use of one of the above processes to populate a pre-defined Target Database. A load and validate facility will then push valid data to the target system.

• Use an application's data conversion/upgrade facility, where available.

Reconciliation

Independent end-to-end comparisons of data content create the necessary level of business confidence.

• Bespoke code is written to extract the required total figures for each of the areas from the legacy, interim and target databases. These figures are totalled and broken down into business areas and segments of relevant interest so that they can be compared to each other. Where differences do occur, investigation determines whether the migration code must be altered or whether there are reasonable mitigating factors.

• Spreadsheets are created to report figures to all levels of management to verify that the process is working and build confidence in the process.

Referential File Integrities

Depending on the constraints of the interim/target database, data may be checked to ascertain and validate its quality. There may be certain categories of dirty data that should be disallowed e.g. duplicate data, null values, data that does not match to a parameter table or an incompatible combination of data in separate fields as proscribed by the analyst. Scripts are written that run automatically after each iteration of the migration. A report is then generated to itemise the non-compatible data.
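Such an integrity script might be sketched in ABAP as follows. ZMIG_CUST is an invented interim table; T151 is the standard check table for customer groups:

REPORT zfile_integrity.

* Report interim rows whose customer group has no matching entry in
* the target parameter table - one category of dirty data.
DATA: lv_kunnr TYPE kna1-kunnr,
      lv_kdgrp TYPE t151-kdgrp,
      lv_check TYPE t151-kdgrp.

SELECT kunnr kdgrp FROM zmig_cust INTO (lv_kunnr, lv_kdgrp).
  SELECT SINGLE kdgrp FROM t151 INTO lv_check
         WHERE kdgrp = lv_kdgrp.
  IF sy-subrc <> 0.
    WRITE: / 'Unmatched customer group', lv_kdgrp,
             'for customer', lv_kunnr.
  ENDIF.
ENDSELECT.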

Quality Assurance

Reconciliation

• Horizontal reconciliation (number on legacy = number on interim = number on target) and vertical reconciliation (categorisation counts, e.g. address counts by region summing to total addresses, both within and across systems) - a sketch of both checks follows.

• Figures at all stages (legacy, interim, target) to provide checkpoints.
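A sketch of both checks in ABAP, assuming an invented legacy staging table ZLEG_CUST and using the standard customer table KNA1 to stand in for the target:

REPORT zreconcile_customers.

DATA: lv_legacy TYPE i,
      lv_target TYPE i,
      lv_land1  TYPE kna1-land1,
      lv_count  TYPE i.

* Horizontal reconciliation: legacy count must equal target count.
SELECT COUNT( * ) INTO lv_legacy FROM zleg_cust.
SELECT COUNT( * ) INTO lv_target FROM kna1.
IF lv_legacy = lv_target.
  WRITE: / 'Horizontal reconciliation OK:', lv_target, 'customers'.
ELSE.
  WRITE: / 'MISMATCH: legacy', lv_legacy, 'target', lv_target.
ENDIF.

* Vertical reconciliation: category counts (here, per country) must
* sum back to the horizontal total.
SELECT land1 COUNT( * ) FROM kna1
       INTO (lv_land1, lv_count)
       GROUP BY land1.
  WRITE: / 'Target customers in', lv_land1, ':', lv_count.
ENDSELECT.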

File Integrities

Scripts that identify and report the following for each table:

• Referential Integrity - check values against target master and parameter files.

• Data Constraints

• Duplicate Data

Translation Table Validation

Run after each new cut of data or new version of the translation tables, in two stages:

• Verifies that all legacy data is accounted for in the "From" translation

• Verifies that all "To" translations exist in the target parameter data (see the sketch below)

Eyeballing

Comparison of legacy and target applications

• Scenario Testing - legacy-to-target verification that data has been migrated correctly for certain customers, chosen by the business, whose circumstances fall into particular categories (e.g. inclusion and exclusion Business Rule categories, data volumes etc.)

• Regression Testing - testing known problem areas

• Spot Testing - a random spot check on migrated data

• Independent Team - the eyeballing is generally carried out by a dedicated testing team rather than the migration team

UAT

This is the customer-based User Acceptance Test of the migrated data, which will form part of the customer sign-off.

Implementation

Freeze

A code and parameter freeze occurs in the run up to the dress rehearsal. Any problems post freeze are run as post freeze fixes.

Dress Rehearsal

Dress rehearsals are intended to mobilise the resources that will be required to support a cutover in the production environment. The primary aim of a dress rehearsal is to identify the risks and issues associated with the implementation plan. It exercises all the steps necessary for a successful 'go live' migration.

Through the execution of a dress rehearsal, all the go-live checkpoints will be properly managed and executed and, if required, the appropriate escalation routes taken.

Go Live window (typical migration)

• Legacy system 'end of business day' closedown

• Legacy system data extractions

• Legacy system data transmissions

• Readiness checks

• Migration Execution

• Reconciliation

• Integrity checking

• Transfer load to Target

• User Acceptance testing

• Reconciliation

• Acceptance and Go Live

===================

LSMW: Refer to the links below for useful information (screen shots for the various LSMW methods):

Step-By-Step Guide for LSMW using ALE/IDOC Method (Screen Shots)

http://www.saptechnical.com/Tutorials/LSMW/IDocMethod/IDocMethod1.htm

Using Bapi in LSMW (Screen Shots)

http://www.saptechnical.com/Tutorials/LSMW/BAPIinLSMW/BL1.htm

Uploading Material Master data using BAPI method in LSMW (Screen Shots)

http://www.saptechnical.com/Tutorials/LSMW/MMBAPI/Page1.htm

Step-by-Step Guide for using LSMW to Update Customer Master Records (Screen Shots)

http://www.saptechnical.com/Tutorials/LSMW/Recording/Recording.htm

Uploading Material master data using the recording method of LSMW (Screen Shots)

http://www.saptechnical.com/Tutorials/LSMW/MMRecording/Page1.htm

Step-by-Step Guide for using LSMW to Update Customer Master Records (Screen Shots) - Batch Input method

Uploading Material master data using Direct input method

http://www.saptechnical.com/Tutorials/LSMW/MMDIM/page1.htm

Steps to copy LSMW from one client to another

http://www.saptechnical.com/Tutorials/LSMW/CopyLSMW/CL.htm

Modifying BAPI to fit custom requirements in LSMW

http://www.saptechnical.com/Tutorials/LSMW/BAPIModify/Main.htm

Using Routines and exception handling in LSMW

http://www.saptechnical.com/Tutorials/LSMW/Routines/Page1.htm

Reward if useful.

Thanks & regards,

Naren

Answers (3)

former_member227476
Active Contributor

Dear Swapna,

Step-by-Step Guide for Using LSMW to Update Customer Master Records

Business Case:

As a part of reorganization and to better serve the customer needs, you are regrouping many of the customers. In SAP terms, you are changing the Sales Office, Sales Group and Customer Groups for specific Customer Master Records. Typically, you would maintain customer records with transaction XD02 to update ‘Sales View’. You would enter Customer Key (Customer No, Sales Organization, Distribution Channel, and Division) and update relevant fields on Sales View screen.

This document contains Step-by-step instructions to use LSMW to update Customer Master Records. It has two demonstration examples - one using Batch Recording and another using standard SAP Object.

Note! The screenprints in this article are from IDES Release 4.6. They may differ slightly in other versions.

Demo Example 1

LSMW to Update Customer Master Records with Transaction Recording

Call Legacy System Migration Workbench by entering transaction code LSMW. Every conversion task is grouped together as Project / Subproject / Object structure. Create a Project called LSMW_DEMO and a Subproject as CUSTOMERS and Object as CUST_REC as shown in Figure 1.

Figure 1 Conversion Task with Project, Subproject and Object

The main screen of LSMW provides wizard-like step-by-step tasks, as shown in Figure 2. To complete your data conversion, you need to execute these steps in sequence. Once a step is executed, the cursor is automatically positioned to the next step.

Note that these steps may look different depending upon your personal menu settings. You can make step numbers visible with the ‘Numbers on’ icon or hidden with the ‘Numbers off’ icon. You can execute a step by double-clicking on its row. The toggle icon ‘Doubleclick=Display’ or ‘Doubleclick=Edit’ sets whether the step opens in ‘display’ or ‘change’ mode.

Figure 2 LSMW Wizard – initial screen

Step 1: Maintain Object attributes

In this example, you will be updating the customer master records with the help of recording a transaction (XD02). Choose radio button Batch Input Recording and click on the recording overview icon to record the R/3 transaction. Enter the Recording name as XD02_REC, the description as Customer Master Updates Recording, and the transaction code as XD02.

Figure 3 Object type ‘Transaction Recording’

The system calls the transaction code XD02 and prompts you to complete the Change Customer transaction, as shown in Figure 4. Enter the key customer information (I entered customer number 1000, sales organization 1000, distribution channel 01, and division 00) and choose ‘Sales’ view within ‘Sales area data’. Make changes to these three fields (I entered sales office 1010, sales group 110, and customer group 01) and save the transaction.

Figure 4 Transaction recording for Transaction Code ‘XD02’

Once the transaction is completed, R/3 records the flow of screens and fields and saves the information, as shown in Figure 5.

Figure 5 Transaction recording overview

Note that the fields are populated with default values: the values you entered when you recorded the transaction.

Note that if you have more fields in the recording than needed, you can remove them by clicking ‘Remove Screen field’ icon.

Observe that the transaction-recording process stores field names in a technical format. Press the F1 key on an individual screen field and then the F9 key to display its technical name. You can then replace the technical names with descriptive names. Double-click on the field RF02D-KUNNR, enter the name as KUNNR and the description as Customer Account Number, and remove the default value. (See Figure 6.)

Figure 6 Field attributes

Similarly, double-click on all other fields with default values and make appropriate changes. Once you have made changes, the recording overview screen looks like what you see in Figure 7.

Figure 7 Transaction Recording Overview – with screen field attributes

Save your changes. When you go back to the initial screen, you will see that the initial screen steps have changed. Since you want to import data via the BDC method, the Direct Input and IDoc-related steps are hidden, as they are not relevant.

Step 2. Maintain Source Structures

Give a name and a description to the source structure (Figure 8).

Figure 8 Source Structure

Step 3. Maintain Source Fields

In this step, you need to list what fields are present in the source structure. The easiest way is to click on ‘Table Maintenance’ icon to enter Fieldname, Type and Length for each field as shown in Figure 9.

Figure 9 Source fields of source Structure

Note that your input file will have four fields as key fields and you need to update three fields in the system.

Step 4: Maintain Structure Relations

Execute the step ‘Maintain Structure Relations’. (See Figure 10.) Since there is only one Source and Target Structure, the relationship is defaulted automatically.

Figure 10 Structure Relation

Step 5: Maintain field mapping and conversion rules

Field RF02D-D0310 indicates that you chose the ‘Sales view’ on the Customer Master screen; accordingly, its value should be set to ‘X’. Keep your cursor on field RF02D-D0310 and click on the Constant rule icon to choose the constant value ‘X’.

If your source file already has the field value, you choose rule ‘Source Field’.

Keep cursor on field ‘KUNNR’ and click on ‘Assign Source field’ icon to choose source field CUSTOMER from structure XD02S as shown in Figure 11.

Figure 11 Assign source fields

Similarly, assign ‘Source Field’ rules to the remaining fields.

Once all the fields are mapped, you should have an overview screen as shown in Figure 12.

Figure 12 Field mapping and Conversion rules overview
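Behind these icons, LSMW stores each rule as a line of ABAP in the generated conversion program. Roughly, and only as a sketch (the field names follow this example; the exact generated code varies by release), the rules above amount to:

* Constant rule: tick the 'Sales' view checkbox on the XD02 screen.
RF02D-D0310 = 'X'.

* 'Source Field' (move) rules: copy each input column to its
* recording field.
KUNNR = XD02S-CUSTOMER.   " Customer Account Number

* If the input file holds unpadded customer numbers, the standard
* ALPHA conversion exit can supply the leading zeros instead:
CALL FUNCTION 'CONVERSION_EXIT_ALPHA_INPUT'
  EXPORTING
    input  = XD02S-CUSTOMER
  IMPORTING
    output = KUNNR.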

Step 6: Maintain fixed values, translations, user-defined routines

You can also maintain re-usable translations and user-defined routines, which can be used across conversion tasks. In this case, that step is not required.

Step 7: Specify files

In this step, we define the layout of the input file. The input file is [Tab] delimited, with the first row containing field names. It is present on my PC (local drive) as C:\XD02.txt. (See Figure 13.)

Figure 13 File attributes

Create an Excel file (Figure 14) with your data and save it as a Tab-delimited text file on your local drive (C:\) and name it XD02.txt.

Figure 14 Source data in Excel file (saved as Tab delimited file)
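For instance, the Tab-delimited XD02.txt might contain the two lines below (the column names are illustrative; the values follow the recording example above, with four key fields and three fields to update):

CUSTOMER  SALES_ORG  DISTR_CHANNEL  DIVISION  SALES_OFFICE  SALES_GROUP  CUST_GROUP
1000      1000       01             00        1010          110          01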

Step 8: Assign files

Execute step ‘Assign Files’ (Figure 15) and the system automatically defaults the filename to the source structure.

Figure 15 Assign file to Source Structure

Step 9: Read data

In this step, LSMW reads the data (Figure 16) from the source file (from your PC’s local drive). You have the option to read only selected rows and convert data values to Internal format.

Figure 16 Read Data

Step 10: Display read data

This step (Figure 17) is optional. If required, you can review the field contents for the rows of data read.

Figure 17 Display Read Data

Step 11: Convert data

This is the step that actually converts the source data (in source format) to a target format. Based on the conversion rules defined, source fields are mapped to target fields.

Step 12: Display Converted data

Again this is an optional step to view how the source data is converted to internal SAP format (Figure 18).

Figure 18 Display Converted Data

Step 13: Create batch input session

Once the source data is converted in an internal format, you can create a batch session to process updates (Figure 19).

Figure 19 Create Batch Input Session

Step 14: Run Batch Input Session

You can execute the BDC session via ‘Run Batch input session’. This uses SM35, the standard transaction for managing BDC sessions. Once you have successfully executed the batch input session, the customer master records are updated in the system. You can confirm this by viewing the customer master records (XD03).

Note! Browsing through these 14 steps, you may get the feeling that this is a very lengthy and time-consuming activity. However, for the purposes of demonstration, I have made it detailed. Although it looks lengthy, it actually takes hardly a few hours from start to finish! After playing around with a few simple LSMW scripts, you will find it easy to change and create more complex ones.

Demo Example 2

LSMW to Update Customer Master Records with Standard Object

As an alternative to using ‘Transaction Recording’, you could also use a standard SAP object to update Customer Master Records. Business Object ‘0050’ is already pre-defined in the system with standard Batch Input Interface Program ‘RFBIDE00’.

Create an Object CUST_OBJ within Project as LSMW_DEMO and Subproject as CUSTOMERS as shown in Figure 20.

Figure 20 LSMW Object with Standard SAP Object

Note! For the Demo example 2, I will list only those steps that are different from the first demo example.

Step 1: Maintain Object attributes

You will be updating the customer master records with the help of Standard Batch Input; therefore, choose radio-button Standard Batch/Direct Input as shown in Figure 21. Enter Object ‘0050’ for Customer Master records and default method ‘0000’ and click on Save.

Figure 21 Standard Batch/Direct Input Object Attributes

Step 4: Maintain Structure Relations

Sales view of Customer Master is stored in table KNVV. Accordingly, you need to update structure BKNVV. However, in addition, the Standard Object ‘0050’ also requires updates to BGR00, BKN00 and BKNA1 structures. (If you do not maintain Structure relations for mandatory entries, you might get a message such as ‘Target structure BKNA1 needs a relation to a source structure’.)

Even though you don’t want to update any fields in these structures, you need to create a relationship with source structures. In all, you need to create relationships for four target structures.

Create a relationship between source structure XD02S and these target structures: keep the cursor on each of the four target structures and click on the ‘Create Relationship’ icon. The structure relations are then maintained as shown in Figure 22.

Figure 22 Structure Relation

Step 5: Maintain field mapping and conversion rules

-- Keep your cursor on the ‘TCODE’ field and click on the ‘Insert Rule’ icon.

Figure 23 LSMW Conversion Rules

Choose radio button ‘Constant’ (Figure 23) to enter the transaction code value ‘XD02’.

-- Keep your cursor on field ‘KUNNR’ and click on the ‘Assign source field’ icon.

Choose source field ‘Customer’ from source structure ‘XD02S’. (See Figure 24.)

Figure 24 Assign Source fields

-- Similarly, choose source fields for Sales Organization, Distribution Channel, and Division. (See Figure 25.)

Figure 25 Field Mapping and Conversion Rules

-- Scroll down to structure BKNVV fields and assign source fields to three fields Sales Office, Sales Group, and Customer Group (Figure 26).

Figure 26 Field Mapping and Conversion Rules

Save and go back to main screen.

Step 12: Display Converted data

When you convert the data, LSMW automatically converts it into the appropriate structure layouts, as required by the standard program (RFBIDE00). (See Figure 27.)

Figure 27 Converted data into multiple structures

Note that if you had only one record in the source file, the converted file has four records.

Earlier, creating this input file so that the standard interface program could read it was a big nightmare, primarily because it could have multiple record layouts. Even for a simple conversion with one input record, you would have to create a complex file with many record layouts. The advantage of LSMW is that it prepares these multi-layout files automatically.

Step 13: Create batch input session

Once the source data is converted into internal format, you can create a BDC session to process the updates (Figures 28 and 29).

Figure 28 Create BDC Session

Figure 29 BDC Session ‘CUST_OBJ’ created

Summary

Once the BDC session is processed successfully, SAP updates the customer master records with the relevant changes. Review these specific customers (transaction code XD03) and confirm that the changes are correctly reflected in the master records.

Reward if it helps.

Siva

Former Member

Hi, refer to the steps below:

LSMW Steps For Data Migration

Example for XD01 (create Customer)

Initially there will be 20 steps, but after processing the first step it will be reduced to 14 for the session method.

1. TCode : LSMW.

2. Enter Project name, sub project name and object name.

Execute.

3. Maintain object attributes.

Execute

select Batch Input recording

goto->Recording overview

create

recording name.

enter transaction code.

start recording

do the recording as per your choice.

save + back.

enter recording name in lsmw screen.

save + back

Now there will be 14 steps.

2. MAINTAIN SOURCE STRUCTURES.

Here you have to enter the name of internal table.

display change

create

save + back

3. MAINTAIN SOURCE FIELDS.

display change

select structure

source_fields->copy fields.

a dialogue window will appear.

select -> from data file

apply source fields

enter No. of fields

length of fields

attach file

save + back

4. MAINTAIN STRUCTURE RELATIONS

display change

save + back

5. MAINTAIN FIELD MAPPING & CONVERSION RULES

display change

click on the source field, select the exact field from the structure and press Enter

repeat these steps for all fields.

save+back

6. MAINTAIN FIXED VALUES, TRANSLATIONS, USER-DEFINED ROUTINES

execute

save + back

7. SPECIFY FILES.

display change

click on legacy data

attach flat file

give description

select tabulator

enter

save + back

8. ASSIGN FILE

execute

display change

save + back

9. IMPORT DATA.

execute

display change

save + back

10. DISPLAY IMPORTED DATA

press Enter; it will show the records only.

back

11. CONVERT DATA

execute

display change

save + back

12. DISPLAY CONVERTED DATA

execute

display change

save + back

13. CREATE BATCH INPUT SESSION

tick keep batch input folder

F8

back

14. RUN BATCH INPUT SESSION.

SM35 will open

The object name will be shown here.

Select the object & process.

LSMW screen shots: Step-by-Step Guide for using LSMW to Update Customer Master Records - http://www.sapbrainsonline.com/TOOLS/LSMW/SAP_LSMW_steps_introduction.html

Former Member

Hi Swapnalatha,

Update your mail ID; I have some files on LSMW and I will send them to you.

Bidhu