I can see no reason to keep separate websites' data in the same index. If it's not being served to a website at all, why have data from another website in 'accidental' proximity to it? Someday a coder WILL make a mistake, or a library upgrade will allow access.
At the very least, sort data and access to it along easy-to-define borders, like websites.

Dennis Gearon

Signature Warning
----------------
It is always a good idea to learn from your own mistakes. It is usually a better idea to learn from others' mistakes, so you do not have to make them yourself.
from 'http://blogs.techrepublic.com.com/security/?p=4501&tag=nl.e036'

EARTH has a Right To Life, otherwise we all die.


----- Original Message ----
From: Jos Janssen <j...@websdesign.nl>
To: solr-user@lucene.apache.org
Sent: Tue, November 23, 2010 4:35:09 AM
Subject: Re: SOLR and secure content

Hi everyone,

This is how we think we should set it up.

Situation:
- Multiple websites indexed on one Solr server
- Results should be separated for each website
- Search results should be filtered by group access

Solution I think is possible with Solr:
- The Solr server should only be accessed through an API, which we will write in PHP.
- Solr server authentication will be based on IP address on the server side, and a username and password will be sent through the API for each website.
- Extra document fields in Solr will contain:
  1. A website hash, to identify and filter results for each website (website authentication)
  2. A list of groups who can access the document (group authentication)

When making a query, these fields should be required. Is it possible to configure handlers on the Solr server so that these fields are required with each type of request — adding documents, deleting, and querying?

Am I correct? Any further advice is welcome.

Regards,
Jos

--
View this message in context: http://lucene.472066.n3.nabble.com/SOLR-and-secure-content-tp1945028p1953071.html
Sent from the Solr - User mailing list archive at Nabble.com.
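On the question of forcing a filter on the query side: a request handler's `invariants` section locks a parameter so clients cannot override it, which can pin the website filter per handler. A minimal solrconfig.xml sketch, with the field name `website_hash` and the hash value purely illustrative:

```xml
<!-- Sketch only: one handler per website, so the site filter query
     is fixed server-side and cannot be overridden by the client.
     "website_hash" and the hash value are illustrative names. -->
<requestHandler name="/select-siteA" class="solr.SearchHandler">
  <lst name="invariants">
    <!-- Locked filter: only documents belonging to this website. -->
    <str name="fq">website_hash:abc123</str>
  </lst>
</requestHandler>
```

Note that `invariants` can only fix a value known in advance, so the per-user group filter (which varies per request) still has to be appended by the trusted API layer; and update/delete requests are not constrained this way, so the API would have to enforce the website hash on those paths as well.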
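The two authentication fields described above translate into filter queries (`fq`) that the API must attach to every search. A small sketch of that filter construction, written in Python for brevity (the actual API layer would be the planned PHP code), with the field names `website_hash` and `groups` assumed from the proposal:

```python
# Sketch of the server-side filter construction described in the thread.
# Field names "website_hash" and "groups" are the proposed schema fields;
# the function name is hypothetical.

def build_filters(website_hash, user_groups):
    """Return the fq clauses the trusted API appends to every query."""
    if not website_hash or not user_groups:
        # Refuse to build an unfiltered query: both checks are mandatory.
        raise ValueError("website hash and group list are both required")
    # Website authentication: restrict results to this site's documents.
    site_fq = "website_hash:%s" % website_hash
    # Group authentication: membership in any one listed group grants access.
    groups_fq = "groups:(%s)" % " OR ".join(user_groups)
    return [site_fq, groups_fq]

print(build_filters("abc123", ["staff", "editors"]))
```

Because both clauses go in as separate `fq` parameters, Solr intersects them with the user's main query, and each filter is cached independently.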