I’ve updated the installation to RC2, and the behaviour in VMware Fusion has changed compared to when I installed one of the BETA releases.
The initial boot fails, the system reboots, and the second attempt succeeds. That is at least more predictable than with the BETAs. I am trying to spot the differences in a screen recording I made, and the only thing I can see changing is the “Image base”. Load path, load device, bootcurrent, bootinfo path, and currdev (disk0p1, EFI) have not changed. Either something in the loader(s) changed, or the RC1 to RC2 update placed contents in more “favourable” places on disk, which made booting possible again. But I do not understand how the loader interacts with the (VMware) EFI firmware well enough to make sense of it.

> On 10 Mar 2021, at 12:22, Ruben van Staveren <[email protected]> wrote:
>
> To continue on the subject of UEFI booting weirdness
>
>> On 9 Mar 2021, at 16:57, Ruben van Staveren <[email protected]> wrote:
>>
>> If I press escape and end up in VMWare’s UEFI setup screen I can boot from
>> any ada*p1 drive and continue as normal.
>> Is UEFI with OpenZFS too new, or is this an issue in VMWare?
>
> I got an off list tip to see whether this was also the case in bhyve, so I
> also created the setup in there, using UEFI boot, and no problems even with
> the special/log/cache NVMe vdevs attached to the pool.
>
> So I’m starting to wonder whether the loader / VMware UEFI firmware (??)
> interaction is a bug in VMware or an edge case that needs to be supported too.
>
> Btw, bhyve is looking nice these days!
>
> Cheers,
> Ruben
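For reference, the variables I am comparing can be dumped like this; a minimal sketch, and the exact variable names may differ slightly between loader versions:

    # At the loader prompt, on both the failing and the succeeding boot:
    OK show          # dump the loader environment (currdev, loaddev, ...)
    OK lsdev -v      # list the devices the loader sees through the EFI firmware

    # From the running system after a successful boot:
    efibootmgr -v                      # BootCurrent/BootOrder as set by the firmware
    kenv | grep -i -e efi -e currdev   # loader variables handed to the kernel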
