Giovanni,
Start with your search results page and work back from there. Decide what 
fields you want to display on the results page, then plan for your Solr 
documents to contain all of those fields. Next you will need a program to 
ingest the data from whatever database and create documents for Solr. This 
program can be written in Python, Java, or whatever you prefer. Or you can 
use the DataImportHandler (DIH). Cheers-- Rick
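
For illustration, here is a rough sketch of such an ingestion script in 
Python. The Solr URL, core name ("content"), database table, and field 
names are all assumptions; adapt them to your own sources and the schema 
you planned from the results page.

# Sketch: pull rows from a source database and post them to Solr's
# JSON update handler. Core name, table, and fields are made up.
import sqlite3
import requests

SOLR_UPDATE_URL = "http://localhost:8983/solr/content/update?commit=true"

def fetch_rows():
    """Pull rows from the source database (an example SQLite table here)."""
    conn = sqlite3.connect("example.db")
    try:
        for row in conn.execute("SELECT id, title, body, modified FROM articles"):
            yield row
    finally:
        conn.close()

def to_solr_doc(row):
    """Map a database row onto the fields the results page will display."""
    doc_id, title, body, modified = row
    return {
        "id": f"db-{doc_id}",      # prefix keeps ids unique across sources
        "title": title,
        "body": body,
        "last_modified": modified,
        "source": "db",            # lets you facet/filter by origin later
    }

def index_batch(docs):
    """POST a batch of documents to Solr as a JSON array."""
    resp = requests.post(SOLR_UPDATE_URL, json=docs, timeout=30)
    resp.raise_for_status()

if __name__ == "__main__":
    batch = []
    for row in fetch_rows():
        batch.append(to_solr_doc(row))
        if len(batch) >= 500:      # send in modest batches
            index_batch(batch)
            batch = []
    if batch:
        index_batch(batch)

A DIH data-config.xml can do the same column-to-field mapping declaratively, 
but either way the field planning comes first.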

On July 24, 2017 5:49:38 AM EDT, Giovanni De Stefano <giova...@servisoft.be> 
wrote:
>Hello guys,
>
>I need to index content coming from different sources (db, filesystems,
>…).
>Those sources share most fields; only a few are specific to each source.
>Content coming from different sources changes at different rates.
>Some sources will generate hundreds of thousands of documents, others
>around one million, and others only a few thousand.
>
>The end user should search and “operate” on “generic” documents
>(faceting, etc.) regardless of the source.
>
>From the management (e.g. import) and search quality (e.g. analysis,
>relevance) point of view, what is considered “best practice”:
>
>one core for all sources and import through different entities,
>one core per source and search across multiple cores, or
>something else?
>
>It would be great if you can share your experience or point me to some
>articles.
>
>Thank you in advance!

-- 
Sorry for being brief. Alternate email is rickleir at yahoo dot com 
