It’s been nearly a decade now and this is still an issue… Micro$oft engineers 
must be making jokes about this. Editing a 5MB text file should not be a 
problem.
Seriously though, my intuition is that the problem can be narrowed down to gedit rendering unknown Unicode characters as their multi-digit code value within a single character cell. Maybe the approach should be to use a single generic “U” in a small rectangle as the only graphical representation of unknown characters. I doubt it is of vital importance to display the exact Unicode value of special characters; users who need that kind of information probably use specialized tools already.
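
To illustrate the difference between the two policies, here is a minimal sketch in Python (nothing to do with gedit's actual GTK rendering code; the placeholder glyph and function names are made up):

    # Sketch of the two rendering policies for unknown/control characters:
    # expanding each one into its code value versus substituting one fixed
    # placeholder glyph per character.

    PLACEHOLDER = "\u2753"  # hypothetical stand-in for the boxed "U" glyph

    def render_escaped(text: str) -> str:
        """Code-value rendering: each unknown character becomes e.g. \\u0001."""
        return "".join(
            ch if ch.isprintable() else f"\\u{ord(ch):04X}"
            for ch in text
        )

    def render_placeholder(text: str) -> str:
        """Proposed rendering: one generic glyph per unknown character."""
        return "".join(ch if ch.isprintable() else PLACEHOLDER for ch in text)

    blob_like = "INSERT INTO t VALUES ('" + "\x00\x01\x02" * 5 + "')"
    print(len(render_escaped(blob_like)))      # grows ~6 cells per unknown byte
    print(len(render_placeholder(blob_like)))  # stays one cell per byte

With megabytes of binary data, the code-value rendering multiplies the amount of text to lay out, while the placeholder keeps it constant.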
As for me, I gave up. Now I use Bless Hex Editor when it comes to text files 
containing blobs and such.

As an example, I attached a mysqldump file containing a single-row INSERT
with about 3 MB of blob data. Such a file is impossible to edit with gedit
3.18.3 on Ubuntu 16.04, even with a 6-core CPU, 12 GB of RAM and an SSD:
gedit hogs an entire core and slowly eats RAM (250 MB after 5 minutes of
processing, and still rising).
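
For anyone who wants to reproduce this without downloading the attachment, here is a rough sketch that generates a similar file (the table and file names are made up; mysqldump without --hex-blob embeds the blob as an escaped binary string on one very long line):

    # Generate an SQL file resembling a mysqldump of a ~3 MB BLOB column.
    import os

    def mysql_escape(data: bytes) -> bytes:
        # Escape the bytes mysqldump escapes inside string literals.
        specials = {0: b"\\0", 10: b"\\n", 13: b"\\r", 26: b"\\Z",
                    34: b'\\"', 39: b"\\'", 92: b"\\\\"}
        out = bytearray()
        for b in data:
            out += specials.get(b, bytes([b]))
        return bytes(out)

    blob = os.urandom(3 * 1024 * 1024)  # ~3 MB of binary payload
    with open("insert_blob_test.sql", "wb") as f:
        f.write(b"INSERT INTO `t` (`id`, `data`) VALUES (1, '")
        f.write(mysql_escape(blob))
        f.write(b"');\n")

Opening the resulting file in gedit should show the same stall: a single multi-megabyte line full of non-printable characters.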

** Attachment added: "example: SQL insert with blob"
   
https://bugs.launchpad.net/ubuntu/+source/gedit/+bug/156201/+attachment/4801632/+files/insert_blob.sql

-- 
You received this bug notification because you are a member of Ubuntu
Bugs, which is subscribed to Ubuntu.
https://bugs.launchpad.net/bugs/156201

Title:
  gedit handles opening big files badly

To manage notifications about this bug go to:
https://bugs.launchpad.net/gedit/+bug/156201/+subscriptions
