Hello Christian,

On Thu, Jun 23, 2005 at 02:00:37PM +0200, Christian Hammers wrote:
> tags 315524 + moreinfo unreproducible
> thanks
> 
> Hello Jörg
> 
> On 2005-06-23 Joerg Rieger wrote:
> > while running a wikipedia database export as XML export the following
> > error occurs reproducible:
> > 
> > # time mysql -u root -p -X -e "select cur_title, cur_text from cur" 
> > wikidb > wiki.xml
> > Enter password:
> > ERROR 2013 (HY000) at line 1: Lost connection to MySQL server during query
> 
> I've never encountered it and can't reproduce it with an arbitrary
> table.
> 
> Does this problem only occur with a specific table? Does it continue
> after you do a "REPAIR TABLE cur EXTENDED;" on it? If so, could you
> send me a table dump or a zipped database file so that I can give it
> to the MySQL developers for investigation?
> 
> If you know about any anomalies like special charsets, binary data or
> table formats other than InnoDB/MyISAM for this table, that would be
> good to know, too.

It's a database dump of the german (de) wikipedia[1] database, about 
a month old. Because of that, the database is quite large (roughly 
1.8 GB). If you still want the file, I could upload it to a server, 
but that'll take a while :-)

Since they use InnoDB as the storage engine, REPAIR TABLE doesn't work:

mysql> REPAIR TABLE cur EXTENDED;
+------------+--------+----------+---------------------------------------------------------+
| Table      | Op     | Msg_type | Msg_text                                                |
+------------+--------+----------+---------------------------------------------------------+
| wikidb.cur | repair | note     | The storage engine for the table doesn't support repair |
+------------+--------+----------+---------------------------------------------------------+
1 row in set (0.00 sec)
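
For what it's worth, the InnoDB-side equivalents I'm aware of would be
something like the following (just a sketch; CHECK TABLE is supported
for InnoDB even though REPAIR TABLE isn't, and an ALTER TABLE to the
same engine forces a full rebuild of the table):

```sql
-- CHECK TABLE works on InnoDB tables, unlike REPAIR TABLE:
CHECK TABLE cur;

-- If corruption is suspected, a no-op engine change makes InnoDB
-- rebuild the table from scratch (slow on a 1.8 GB table):
ALTER TABLE cur ENGINE=InnoDB;
```

I haven't tried these on the wikidb copy yet, given its size.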


I also have a wikinews DB within the same mysql installation, XML 
export works fine with this one, short example:

<?xml version="1.0"?>

<resultset statement="select cur_title, cur_text from cur">
  <row>
          <cur_title>Hauptseite</cur_title>
.
.
.

 
However, that database is considerably smaller than the other wiki DB 
(only about 7.5 MB).

As far as I can recall, I made a successful XML dump of that big wiki 
DB on the same machine a while ago. Maybe a feature backport from 
MySQL 5? But the changelog doesn't mention anything like that, so 
probably not.
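
One guess, since ERROR 2013 seems size-dependent here: some wikipedia 
articles have very large cur_text values, so this might be a 
packet-size or timeout issue rather than corruption. A hedged sketch 
of what I'd check/raise before retrying (the values below are purely 
illustrative, not known-good settings):

```sql
-- Current limit on the largest single row/packet the server sends:
SHOW VARIABLES LIKE 'max_allowed_packet';

-- Illustrative values only; the limits actually needed depend on
-- the largest cur_text row in the dump:
SET GLOBAL max_allowed_packet = 64 * 1024 * 1024;
SET GLOBAL net_write_timeout  = 600;
```

If the export then succeeds, that would point at the connection being 
dropped on an oversized row rather than a bug in the table itself.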




[1] http://download.wikimedia.org/

