on 12-07-2005 12:11 PM
hi all
I've created an index and assigned a data source to it. Previously it was able to find some documents. To make it search all documents I clicked on reindex, and after that it cannot find even a single document. So I deleted that index and created another one, but it still cannot find any documents. Display Queues shows that no documents got status OK. Can anybody please suggest a solution to this problem?
regards
gnana
Hi,
documents larger than 10 KB were not indexed. The reason for this is that documents smaller than 10 KB are automatically transmitted to TREX by the crawler. For documents larger than 10 KB, the crawler only transmits the URI and TREX then fetches them by itself.
The URLs that are sent to TREX are normally built by the URL generator using the host entry. If you have a clustered landscape this might not work, because the firewall, load balancers, web servers, or external authentication systems might cause problems when TREX calls them directly via the normal portal URL.
If you use SSL for the communication between TREX and the portal, it gets even more complicated. So if you have a clustered system, make sure that communication from TREX to the portal can take place without problems. Take a look at the trace files of the TREX preprocessor; they should give you a hint on this.
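As a rough illustration of the 10 KB split described above (the directory and file names here are placeholders for demonstration, not real KM repository paths), you can list which files in a folder exceed the threshold and would therefore have to be fetched by TREX itself over the portal URL:

```shell
# Demo setup: a folder with one file below and one above the 10 KB threshold.
# (/tmp/km_repo_demo is a placeholder path, not a real KM location.)
mkdir -p /tmp/km_repo_demo
head -c 2048  /dev/zero > /tmp/km_repo_demo/small.txt   # 2 KB: crawler pushes it to TREX directly
head -c 20480 /dev/zero > /tmp/km_repo_demo/large.txt   # 20 KB: TREX must fetch it via the URL

# Files above 10 KB -- the ones whose URLs TREX must be able to reach itself:
find /tmp/km_repo_demo -type f -size +10k
```

If this list is long and nothing gets indexed, it supports the theory that TREX cannot reach the portal URLs directly (firewall, load balancer, or SSL in between); in that case checking connectivity from the TREX host is the next step.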
hi Kirupanand venkatapathi
How do I see the trace files of the TREX preprocessor? Also, I don't have any idea about clustered landscapes and SSL. Is there a method like clearing the cache? Because I deleted one index, and when creating a new index with the deleted index's name it shows an error. So can you tell me an exact way to make my search work?
Regards
gnana
Hi Gnana and Rukmani,
I have a similar error and my application log shows the following error:
Failed to create crawler task TestIndex_Public Documents - com.sapportals.wcm.service.xcrawler.XCrawlerException: The SQL statement "SELECT "XCRW_TASK_INDEX" FROM "KMC_XCRW_TASKS" WHERE "XCRW_TASK_ID" = ?" contains the semantics error[s]: type check
We are on EP6 SP14 patch 0. Do you know if this is a patch issue or something else? Can you please let me know how you got it resolved.
Thanks,
Mandar
Hi,
are there any documents in the TREX queue, and if yes, in which state are they?
Regards,
Achim
Hi Achim Weigel
Crawlers are processes that retrieve documents and analyze their contents in order to locate additional documents for processing.
There are no crawler tasks to display.
the above message is displayed in crawler monitor.
Where can I create crawler tasks? While creating the index I assigned the standard crawler parameters.
Regards
Gnana
Hi Gnana,
did you attach the folder?
Is it a Search Index (or Search and Classification) index?
Is TREX Monitor showing a valid TREX connection?
Do you index Documents, Folders, or both?
Are there docs of the requested kind?
Does TREX receive any docs?
Did you try to 'flush' the TREX index in order to see immediate results? Otherwise you have to wait 30 min.
Regards Matthias
Hi Gnana,
did you attach the folder?
<b>yeah i attached the folder</b>
Is it a Search Index (or Search and Classification) index?
<b>it is both search and classification index</b>
Is TREX Monitor showing a valid TREX connection?
<b>yeah, in TREX Monitor it is showing all servers available</b>
Do you index Documents, Folders, or both?
Are there docs of requested kind?
I didn't understand what you mean.
Does TREX recieve any docs?
<b>how do I check this? But in Display Queues it is showing that some index entries got status OK.</b>
Did you try to 'flush' the TREX index in order to see immediate results? Otherwise you have to wait 30 min.
<b>i tried flush</b>
Previously the number of indexed documents was increasing. Now the index is showing in Display Queues, and I am able to search some of the docs that are indexed.
Regards
Gnana
Hi Gnana,
does this mean that your issue is solved?
You might think of defining a smaller 'Scheduling Time' (default 30 min).
TREX Monitor -> Edit Queue Parameter (of your Index)
Additionally it won't harm to read some documentation:
http://help.sap.com/saphelp_nw04/helpdata/en/e3/92323cb24e11d5993800508b6b8b11/frameset.htm
should be a good starting point.
Don't forget to assign points for helpful (and possibly future) answers.
Regards Matthias
Hi Matthias Roebig-Landau
In Display Queues the status for my index shows red, but day by day the number of docs indexed with status OK is increasing: earlier it was 80, now it is 250. But there are some thousands of files in my repository; how do I get all the docs indexed? Also, the application log is showing an XCrawler service error; how do I rectify this? Should I change my scheduling time from 30 min to some other value in Edit Queue Parameters? Please help me out.
Awaiting your reply,
Regards
Gnana
Hi Gana,
I am not able to view any documents in TREX Monitor. Display Queues shows the status as green, but none of the documents are indexed; all the columns in the Display Queues table show 0.
Can you please let me know how you solved this problem and were able to index the documents?
I have only 3-4 txt files in a folder which I want to index and then search.
Regards,
Vivek
hi Matthias Roebig-Landau
Actually my problem is that my index is indexing only 250 files among thousands of files. In Display Queues the status shows red, whereas in index administration its status shows green. When I check the application log it shows an "XCrawler Service" error, but my search works for those 250 files. So I want to index all the files now.
I hope you understand my problem.
Awaiting your reply,
Regards
Gnana
the following is showing in
<b>Crawler monitor:</b>
Crawlers are processes that retrieve documents and analyze their contents in order to locate additional documents for processing.
There are no crawler tasks to display.
<b>Application Log:</b>
XCrawlerService Failed to create crawler task enteg_019-Solution Manager Official.text - com.sapportals.wcm.service.xcrawler.XCrawlerException: The SQL statement "SELECT "XCRW_TASK_INDEX" FROM "KMC_XCRW_TASKS" WHERE "XCRW_TASK_ID" = ?" contains the semantics error[s]: ty
Failed to create crawler task enteg_000 Knowledge Repository - com.sapportals.wcm.service.xcrawler.XCrawlerException: The SQL statement "SELECT "XCRW_TASK_INDEX" FROM "KMC_XCRW_TASKS" WHERE "XCRW_TASK_ID" = ?" contains the semantics error[s]: type check e
Awaiting your reply,
Gnana
Hi Dirk
Yes, I am working on NetWeaver 04 SR1, and in the following path
<b>Support Packages and Patches -> SAP NetWeaver -> SAP NETWEAVER -> SAP NETWEAVER 04 -> Entry by Component -> Enterprise Portal (EP)</b>
i am getting
KMC NW04 SP09, and the patch level shows 0, whereas
"Patch for SP14 of Content Management + Collaboration 6.0 640" shows patch level 3.
So which one do I have to download? Please suggest. Is the above path correct, or do I have to upgrade TREX as well?
Regards
Gnana
Hi,
The location of the trace depends on the version of the portal that you are using. If you are using NW04 up to SPS 10, then you can find the trace in the config trace file. This is named config.x.trc (x is a number) and is found under the /serverx/log directory.
In other versions, the trace will be in the default_trace.x.trc files. These are in the same folder.
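As a quick sketch of how to locate those trace files from the command line (the engine root directory below is a placeholder created for demonstration; on a real system substitute your J2EE engine's cluster directory):

```shell
# Placeholder layout mimicking a portal install; a real path would be
# something like the J2EE engine's cluster directory, not /tmp.
J2EE_HOME=/tmp/j2ee_demo
mkdir -p "$J2EE_HOME/server0/log"

# NW04 up to SPS 10 writes config.x.trc; later versions write default_trace.x.trc.
touch "$J2EE_HOME/server0/log/config.0.trc" \
      "$J2EE_HOME/server0/log/default_trace.0.trc"

# List every trace file across all server nodes, newest first:
ls -t "$J2EE_HOME"/server*/log/*.trc

# Search them for crawler-related errors (the pattern is just an example):
grep -l "XCrawler" "$J2EE_HOME"/server*/log/*.trc || true
```

On a clustered portal, remember to check the log directory of every server node, since the crawler task may have run on any of them.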
REWARD USEFUL ANSWERS
Message was edited by: Kirupanand venkatapathi