
query help needed for querybuilder to use with lcm cli

Former Member
0 Kudos

Hi,

I had set up several queries to run with the lcm cli in order to back up personal folders, inboxes, etc. to lcmbiar files to use as backups.  I have seen a few posts that are similar, but I have a specific question/concern.

I recently had to reference one of these backups, only to find it was incomplete.  Does the query used by the lcm cli also pull only the first 1000 rows? Is there a way to change this limit somewhere?

Also, importing an lcmbiar file built for something generic like 'all personal folders' pulls in WAY too much content - is there a better way to limit this? I am open to suggestions, but it would almost be better if I could create individual lcmbiar output files on a per-user basis.  That way, when/if I need to restore someone's personal folder contents, for example, I could find them by username and import just that one lcmbiar file, as opposed to restoring all 3000 of our users.  I am not quite sure how to accomplish this...

Currently, with my limited Windows scripting knowledge, I have set up a bat script that runs each morning and creates a 'runtime' properties file from a template, so that the lcmbiar file gets named uniquely for that day and its content.  Then I call lcm_cli with the proper command.  The query within the properties file is currently very straightforward: select * from CI_INFOOBJECTS WHERE SI_ANCESTOR = 18.

To do what I want to do...

1) I'd first need a current list of usernames in a text file that could be read in and parsed to single out each user (remember, we are talking about 3000) - I'm not sure of the best way to get this list.

2) Then, instead of just updating the lcmbiar file name with a unique name as I do currently, I would also update the query (which would be different altogether):  SELECT * from CI_INFOOBJECTS where SI_OWNER = '<username>' AND SI_ANCESTOR = 18.

In theory, that would grab everything owned by that user in their personal folder - right? - and write it to its own lcmbiar file in a location I specify.
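For illustration, the two steps above can be sketched in Python (rather than batch, purely for readability). Note the `users.txt` filename, the `{QUERY}`/`{BIAR}` template tokens, and the commented-out lcm_cli flag are all assumptions invented for this sketch, not the actual names your template or install uses:

```python
import subprocess  # only needed if you uncomment the lcm_cli call
from datetime import date
from pathlib import Path

def build_query(username: str) -> str:
    """Per-user query from step 2; 18 is the root 'User Folders' id."""
    return ("SELECT * FROM CI_INFOOBJECTS "
            f"WHERE SI_OWNER = '{username}' AND SI_ANCESTOR = 18")

def build_properties(username: str, template: str) -> str:
    """Fill a template with per-user values.

    {QUERY} and {BIAR} are placeholder tokens made up for this sketch --
    substitute whatever tokens your real template file uses."""
    biar_name = f"{username}_{date.today():%Y%m%d}.lcmbiar"
    return (template.replace("{QUERY}", build_query(username))
                    .replace("{BIAR}", biar_name))

def export_all(user_list: Path, template_path: Path, out_dir: Path) -> None:
    """Write one runtime properties file per username, then (optionally)
    invoke the LCM CLI once per file."""
    template = template_path.read_text()
    for username in user_list.read_text().splitlines():
        username = username.strip()
        if not username:
            continue
        props = out_dir / f"{username}.properties"
        props.write_text(build_properties(username, template))
        # Hypothetical invocation -- check your lcm_cli.bat documentation
        # for the correct flag before using:
        # subprocess.run(["lcm_cli.bat", "-lcmproperty", str(props)], check=True)
```

The same loop translates directly into a bat or PowerShell script; only the templating syntax changes.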

I just think chunking the backup like this is more effective, and BO has no built-in backup capability that already does this.  We are on BO 4.0 SP7 right now and move to 4.1 SP4 over the summer.

Any thoughts on this would be much appreciated.

thanks,
Missy

Accepted Solutions (1)


Former Member
0 Kudos

An additional question to my original post - maybe this is better suited for its own topic, but I am giving it a go here first...

Is there a command that can be run (possibly scheduled) to clean up the LCM > Promotion Jobs folder? Something to automatically delete the Query_Administrator_* jobs created by the lcm_cli after X number of days?

I was cleaning this up manually before, but that was when I was only creating 5 separate jobs every morning... now the backup of these personal folders alone is going to create at least 1000 jobs (thanks to that modified query!).

I figure if there's a way to kill sessions via a command that gets scheduled... there *should* be a way to delete these jobs!  Any help on this one too, would be much appreciated!!

thanks,

missy

joshua_kuhn
Employee
Employee
0 Kudos

If it's the actual job object you want to delete, then you'd have to use the SDK to script it out.  If it's the instances for a particular job, then you could use the instance 'limits' option to set a max number of instances for a scheduled job, and this could work at the 'Promotion Jobs' folder level.  Instance limits are really the only type of cleanup built into the current product, and as the name implies, they apply only to instances, not to the schedulable object itself.

As for cleaning up the actual job, I don't know of a script that already exists to do this, but it wouldn't be too difficult to create one.  You would basically loop through the infoobjects that sit under the parent folder 'Promotion Jobs', compare the SI_UPDATE_TS property to see whether each object exceeds the age you specify, issue an "infoobjects.delete(infoobject)" for each infoobject that meets your criteria, and commit at the end.  Just be careful to exclude from the CMS query any object you don't want deleted.
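The selection logic described above can be sketched like this (Python with made-up stand-in records purely for illustration; the real work would go through the platform's Java SDK, whose delete/commit calls this sketch only mimics in the comments):

```python
from datetime import datetime, timedelta

def jobs_to_delete(jobs, max_age_days, now=None):
    """Return names of promotion jobs older than max_age_days.

    `jobs` is a list of (si_name, si_update_ts) pairs standing in for the
    infoobjects a CMS query against the 'Promotion Jobs' folder would
    return.  In the real SDK you would call infoobjects.delete(obj) for
    each match and commit the collection once at the end."""
    now = now or datetime.now()
    cutoff = now - timedelta(days=max_age_days)
    return [name for name, updated in jobs
            if name.startswith("Query_") and updated < cutoff]
```

The `Query_` name filter mirrors the pattern mentioned earlier in the thread; scope the real CMS query at least that tightly so nothing else gets deleted.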

Former Member
0 Kudos

I honestly haven't done anything with the SDK as of yet, so for now, I'll keep this in mind! Thanks for the info!

-missy

joshua_kuhn
Employee
Employee
0 Kudos

Try importing this biar file. The program file it imports will delete anything returned by the following query:


select top 5000 si_name from ci_infoobjects where si_kind = "LCMJob" and si_instance = 0 and si_name like '%Query_%'


https://share.sap.com/a:gtpal1/MyAttachments/3a35266a-1f64-46c5-adcf-a81066d54828/

Former Member
0 Kudos

That was fantastic! Worked perfectly – thank you so much!

-missy

Answers (2)


Former Member
0 Kudos

So apparently, this query:

SELECT * from CI_INFOOBJECTS where SI_OWNER ='<username>' AND SI_ANCESTOR = 18

does not actually grab everything in a particular user's personal folder.  It only grabs files where that user is set as the owner. Does anyone know the proper query to grab EVERYTHING in a particular user's folder? I may post this question to one of the other querybuilder threads too, if there are no bites here.

Also, in case anyone else is interested... I am making good progress with this and am testing my script now.  I'll post it once I have verified it... I started down the path of using VBScript but ultimately switched to PowerShell and have been chugging along.

Thanks,

missy

Former Member
0 Kudos


It should work.

Former Member
0 Kudos

Well, it doesn't. I have a user that should return 54 objects total (including their own named folder); however, the query only returns 51 objects.

It looks like it is not pulling in objects created by other users that my user has saved to their personal folder.

It would also be great if I could *not* include the ~WebIntelligence folder and its contents in my query.

joshua_kuhn
Employee
Employee
0 Kudos

SI_ANCESTOR is used to return all descendants of a particular si_id.  Originally you were using SI_ANCESTOR = 18, which should return all child objects of the top-level 'User Folders' object - in other words, all user folders in the system.  If you want a particular user folder, such as the personal folder for the 'administrator' account, and also want to filter out the '~WebIntelligence' subfolder, then you'd specify that folder's id with si_ancestor... something like this:

select * from ci_infoobjects where si_ancestor = 850 and si_name != '~Webintelligence'

If you're doing this for every user folder in the system, then I suppose you'd want the child folder ids of root folder 18.  Those could be returned with a query like:

select si_name, si_id from ci_infoobjects where si_parentid = 18

Once you have those values, create a loop that iterates through each infoobject and replaces the 850 in the first query with the si_id returned by the second.
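A minimal sketch of that loop (Python for illustration; the `(si_name, si_id)` rows come from the second query above, and the per-folder query mirrors the first):

```python
def per_folder_queries(child_folders):
    """Build one export query per personal folder.

    `child_folders` is an iterable of (si_name, si_id) pairs, as returned by
    'select si_name, si_id from ci_infoobjects where si_parentid = 18'.
    Each generated query scopes to one folder and excludes the
    ~WebIntelligence subfolder, as suggested above."""
    for si_name, si_id in child_folders:
        yield si_name, ("select * from ci_infoobjects "
                        f"where si_ancestor = {si_id} "
                        "and si_name != '~Webintelligence'")
```

Each (username, query) pair could then feed the per-user properties-file generation discussed earlier in the thread.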

Former Member
0 Kudos

Thanks for the reply.  That helped, but it actually helped me realize that I wanted something a little different! I ended up going with this:

select top 5000 * from ci_infoobjects where si_parentid = 18 and SI_NAME NOT IN ('Guest', 'SMAdmin', 'Administrator', 'QaaWSServletPrincipal') AND SI_CHILDREN !=0 AND SI_NAME = SI_OWNER ORDER BY SI_NAME

Essentially, that gives me the complete list of users with personal folders that have actual content.  I am probably still getting a few ~WebIntelligence folders, but I realize now that what I was really after was the list of users who have logged in and/or saved something to their personal folder; otherwise, I don't care whether they get backed up.

Thanks again!

-missy

Former Member
0 Kudos

Just wanted to pass along that SAP Support pointed me to KBA 1969259, which has some good example queries in it (they were helping me with a concern I had over the lcmbiar file output, not with query design).  I was able to tweak one of the sample queries in that KBA to give me more of what I was after...

SELECT TOP 10000 static, relationships, SI_PARENT_FOLDER_CUID, SI_OWNER, SI_PATH FROM CI_INFOOBJECTS,CI_APPOBJECTS,CI_SYSTEMOBJECTS WHERE (DESCENDENTS ("si_name='Folder Hierarchy'","si_name='<username>'"))

This exports inboxes, personal folders, categories, and roles, which is more than I was after but still necessary to back up... so in a way it is actually better, because I now have one lcmbiar file per user that contains all of their 'personal' objects.

So between narrowing my set of users down to only those who have actually saved things to their personal folders, having a query that finally returns what I expect it to return, and the help below with a job to clean up the excessive number of promotion jobs I am now creating... I am all set!

Hopefully this can help someone else too!

Thanks,

missy

Former Member
0 Kudos


If you want to automate the backup, the LCM method is not the right choice for you. Backing up content in LCM biar files is not a recommended method. You should use traditional backup methods like a CMS DB backup plus a file store backup.

If you want to back up a handful of users using biar files, you can do it with the script you have - but not for 3000 users.

Regarding the limit - you can add a TOP clause to the query, e.g. SELECT TOP 5000 * FROM ...

Former Member
0 Kudos

We have CMS DB backups and filestore backups, but those don't help me easily (or quickly) when a user says, 'Oops, I deleted this folder and it had 20 reports in it - can you get that back?'.  I should have a way of doing that without having to run through a restore process for the whole environment.  That is what you are referencing, correct?

Thanks for the TOP 5000 suggestion.  That did work to bring back more than 1000 - I didn't even think to try that, since Query Builder normally defaults to retrieving only 1000 records.

What I would still really like to do is create an lcmbiar file for EACH user (automatically)... so that I do not have to restore all 3000 users because of 1 user's mistake.  It just so happens we have 3000 users in our system; I don't really see why it matters whether I want to do this for a few users or a few thousand.

Thanks,

Missy