Right... You know, if some of your data needs to be updated frequently, but other data is updated once per year and is a really massive dataset, then maybe split it up into separate cores? Since you mentioned that you can't get the raw data again, you could duplicate your existing index by doing a filesystem copy. Leave that copy alone so you don't update it and lose your data, and start a new core that you can update, ignoring the fact that it still has all the website data in it. Then tie the two cores' data sets together outside of Solr.
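Something along the lines of this multi-core solr.xml is what I have in mind (the core names and instanceDir paths are just placeholders, point them at wherever you keep the copy and the new index):

<solr persistent="true">
  <cores adminPath="/admin/cores">
    <!-- the filesystem copy of your existing index: never send updates here -->
    <core name="archive" instanceDir="archive" />
    <!-- the new core that takes the frequent updates -->
    <core name="live" instanceDir="live" />
  </cores>
</solr>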
Eric

On Thu, Aug 27, 2009 at 1:46 PM, Paul Tomblin<ptomb...@xcski.com> wrote:
> On Thu, Aug 27, 2009 at 1:27 PM, Eric
> Pugh<ep...@opensourceconnections.com> wrote:
>> You can just query Solr, find the records that you want (including all
>> the website data). Update them, and then send the entire record back.
>>
>
> Correct me if I'm wrong, but I think you'd end up losing the fields
> that are indexed but not stored.
>
>
> --
> http://www.linkedin.com/in/paultomblin
>
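For what it's worth, the round trip being discussed above looks roughly like this (Python 2, standard library only; the core name, query, and field names are made up, and there's no XML escaping or multi-valued field handling, just the shape of it):

import json, urllib, urllib2

# 1. Query for the docs you want to update. Only *stored* fields come back.
params = urllib.urlencode({'q': 'type:website', 'wt': 'json', 'rows': '10'})
rsp = json.load(urllib2.urlopen('http://localhost:8983/solr/live/select?' + params))
docs = rsp['response']['docs']

# 2. Modify them in memory.
for doc in docs:
    doc['status'] = 'updated'  # made-up field

# 3. Re-add the whole documents. Anything that was indexed but not stored
#    never came back in step 1, so it is silently lost here -- Paul's point.
add_xml = '<add>' + ''.join(
    '<doc>' + ''.join('<field name="%s">%s</field>' % (k, v)
                      for k, v in doc.items()) + '</doc>'
    for doc in docs) + '</add>'

req = urllib2.Request('http://localhost:8983/solr/live/update?commit=true',
                      add_xml, {'Content-Type': 'text/xml'})
urllib2.urlopen(req)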