With MySQL it depends on the character set you are creating the table
with. For example, with MySQL 5.0.45:

mysql> create table foo ( x varchar(999), primary key (x)) character set = 'latin1';
Query OK, 0 rows affected (0.00 sec)

mysql> create table foo2 ( x varchar(999), primary key (x)) character set = 'utf8';
ERROR 1071 (42000): Specified key was too long; max key length is 999 bytes

mysql> create table foo2 ( x varchar(255), primary key (x)) character set = 'utf8';
Query OK, 0 rows affected (0.01 sec)


So, if you can live without the benefit of UTF-8, you can achieve your
result.

Personally, I think you should look at your DB design, as indexes on
large fields are expensive in terms of the space they use and the time
it takes to retrieve results. You could instead index the MD5 of the
field (32 bytes as a hex string) and then search for duplicates on
that key. It might be faster and cheaper on disk.
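
A minimal sketch of that approach (the table and column names here are
just made up for illustration): store the hex MD5 alongside the
original value and put the key on the hash only.

create table foo3 (
    x      varchar(999),
    x_md5  char(32) not null,
    primary key (x_md5)
) character set = 'utf8';

-- populate the hash on insert and use it when checking for duplicates
insert into foo3 (x, x_md5) values ('some long value', md5('some long value'));
select x from foo3 where x_md5 = md5('some long value');

The key is then a fixed 32 characters no matter how long x gets, which
keeps the index small.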

regards
Ian

Dj Gilcrease wrote:
> The max length of a varchar field is database dependent.
>
> In MySQL it is 255
> In MSSQL it is 2^31-1 (2 GB)
> In PostgreSQL it is ~2^30-1 (1 GB)
>
> MySQL 4.1 and greater supposedly changes any varchar or char field
> with a max length greater than 255 to a TEXT field, which can hold
> ~1 GB.
>
> Dj Gilcrease
> OpenRPG Developer
> ~~http://www.openrpg.com

