641770 – Fedora-14-Beta-x86_64-Live boot failures no root device found sleeping forever


I have Linux Mint Cinnamon 20.2 installed on an old Sony VAIO netbook (VPCM11M1E).

It was working great, but today, after I applied an upgrade recommended by the update tool, it displayed a message saying a reboot was required. I rebooted by selecting Restart in the Cinnamon GUI.

It was the last time I was able to use my computer :-(

It now boots into the GRUB command line, with the prompt grub>.

grub> ls
(hd0) (hd0,msdos5) (hd0,msdos1)

grub> set root=(hd0,msdos5)

grub> ls /boot

grub> linux /boot/vmlinuz-5.4.0-80-generic

grub> initrd /boot/initrd.img-5.4.0-80-generic

No root device specified. Boot arguments must include a root= parameter

BusyBox v1.30.1 (Ubuntu 1:1.30.1-4ubuntu6.3) built-in shell (ash)
Enter 'help' for a list of built-in commands.

Going back into the grub> prompt after a reboot, I can't see /dev/hd* or /dev/sd* on the root device, so I don't know what to pass to the linux boot command.

Being an old notebook, this is a standard boot (the BIOS does not support UEFI). It is currently configured for USB first, then local disk (no USBs connected).

Not sure what to do next, would anyone be able to help?
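For anyone landing here from a search: when booting manually from the grub> prompt, the root device has to be passed on the linux line itself, since no grub.cfg supplies it. A minimal sketch, assuming the Mint root filesystem really is on (hd0,msdos5), which the kernel will usually name /dev/sda5 (an assumption; the actual name could differ):

```
grub> set root=(hd0,msdos5)
grub> linux /boot/vmlinuz-5.4.0-80-generic root=/dev/sda5 ro
grub> initrd /boot/initrd.img-5.4.0-80-generic
grub> boot
```

If the device-name guess is wrong, the initramfs drops to the BusyBox shell again; from there, `ls /dev/sd*` inside BusyBox (not at the grub> prompt, which has no /dev) shows what the kernel actually detected.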


Description of problem:
used Fedora LiveUSB Creator
and  Fedora-14-Beta-x86_64-Live.iso
to make a bootable USB.

An ordinary boot appears to fail, with the keyboard becoming partially unresponsive
(Caps Lock unresponsive; Ctrl-Alt-Delete works for reboot).

A simple-graphics boot has the three-color progress bar advance most of the way to the right, but then the screen blanks with the message:

    no root device found 
    Boot has failed, sleeping forever

How do I troubleshoot this system and get it to boot?

The last kernel to boot successfully on this hardware is:
Linux amd.home #1 SMP Thu May 6 18:09:49 UTC 2010 x86_64 x86_64 x86_64 GNU/Linux

2010-10-11 19:40:39 UTC

changing the boot options to

root=live:LABEL=FEDORA     (changed from the USB's UUID)

does not seem to help.

2010-10-11 20:22:57 UTC

A low-graphics boot without 'quiet rhgb'

proceeds all the way to 

Starting HAL daemon
retrigger failed udev events
Adding udev persistent rules
Enabling Bluetooth devices:
Starting sendmail:
Starting sm-client:
Starting abrt daemon: eth0: no IPv6 routers present

after a pause, the screen goes blank and the keyboard is unresponsive.

2010-10-11 20:25:06 UTC

(In reply to comment #2)

This was using a CD burnt with Fedora-14-Beta-x86_64-Live.iso

2010-10-11 21:12:51 UTC

holding shift down and entering 'linux0' at boot also fails. 
(Text output scrolls past then screen goes blank and keyboard is unresponsive.)

2010-10-24 04:59:42 UTC

How did you make the USB key? In Fedora 13?
Which version of Fedora LiveUSB Creator did you use?

I ask because I think I tried this without success. Then I burned the CD but that didn't help either.

2010-10-24 05:25:47 UTC

I used liveusb-creator-3.9.2-1.fc12.noarch (note that, for some reason, the version reported by the tool is different:
  $ liveusb-creator --version

Make sure you specify the right /dev/sdXn device and partition to point to the USB key.
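To see the device names and the label a key actually carries, `blkid` on the host is the usual tool. As a sketch of what to look for (the output line below is an assumed example, not taken from this bug):

```shell
# On a real host you would run:  blkid /dev/sdb1
# Here we parse an assumed sample line to show which field feeds root=live:LABEL=...
sample='/dev/sdb1: LABEL="FEDORA" UUID="1234-ABCD" TYPE="vfat"'
label=$(printf '%s\n' "$sample" | grep -o 'LABEL="[^"]*"' | cut -d'"' -f2)
echo "$label"
```

The printed value is what belongs after root=live:LABEL= on the kernel line; a mismatched or missing label is one way to end up at "No root device found".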

2010-11-04 18:26:05 UTC

I've been happy with RedHat/Fedora for many years. 

In the absence of advice from experts, I'd recommend:
Plan A: Try the F14 DVD or CD install or live CD.

Plan B: Hold off on F14 for now. Install with F13 which works, and use yum upgrade to keep current until F15 comes out.

Warning: I found the later F13 kernels troublesome for my video cards, so if you have a system which boots before but not after an upgrade to the 2.6.34 or 2.6.35 kernels, keep the original F13 kernel. But 2.6.33 and 2.6.36 and later work for me.

Unfortunately rumors are that 2.6.36 won't be in the F14 updates.

2010-11-04 18:38:03 UTC

#8 (Matteo Settenvini) and #9 (Phil V): Can you comment on whether the workaround in comment #5 is working for you? Any error message to report?

"In my case, editing the GRUB command line, replacing root=live:UUID... by
root=live:/dev/sdb1 (which is the usb partition where the image is) is a
successful workaround."

2010-11-04 20:47:14 UTC

#10: I was finally able to boot it in low-graphics mode (an unrelated issue with my ATI card, I think; strange, though, because the open-source ATI drivers in Ubuntu don't give me any problems), and by specifying root=/dev/sr0.
It was a little bit difficult to guess, anyway.

2010-11-18 11:54:46 UTC

Same problem. I created a live USB with live-usb-creator and the F14-64 live CD on an F14 system, then tried to use the live USB to install to a five-hard-drive desktop with a one-year-old Gigabyte motherboard. Got the:
"no root device found 
    Boot has failed, sleeping forever"
a few seconds into the boot.

#5: "In my case, editing the GRUB command line, replacing root=live:UUID... by
root=live:/dev/sdb1 (which is the usb partition where the image is) is a
successful workaround." solved this for me, with the modification that the live-usb was on sdf1 for me.

2010-12-03 10:28:57 UTC

Got the same problem on a new HP laptop so it seems to be hardware independent.

Out of curiosity, is this related to the live-cd (meaning Fedora is stuck with it until that iso is deprecated) or can it be live-usb-creator?

2011-03-03 19:45:57 UTC

Was this problem resolved? I'm seeing the same problem with a current (fully updated as of 3 March 2011) Fedora 14 running livecd-creator to generate an F14 livecd image.

2011-03-03 22:12:09 UTC

I did not find a way to boot Fedora 14 successfully.

I am able to boot gfx_test_week_20110221_x86-64.iso, but only with the low-quality 'vesa' graphics driver.
On the other hand, I reported in bug 679674 that booting with
'nomodeset drm.debug=0x04' results in:

No root device found

No root device found

Dropping to debug shell.

sh: can't access tty; job control turned off

I've got the same error with pm-test-day-20110324.iso

2011-04-29 17:34:45 UTC

In my case the problem is buggy virtual CD-ROM drive firmware, i.e. a bad "READ TOC" implementation, so udev fails to identify the CD and mount root.

The udev version from el6 (udev-147-2.35.el6) works OK, and udev from F14 fails. So clearly something got broken in udev; earlier versions were able to work around the issue of a buggy CD-ROM drive.

A patch should be added to udev to work around the issue.
More info from: https://bugzilla.redhat.com/show_bug.cgi?id=681999

I get "No root device found", "sleeping forever" trying to boot Fedora 14 kernel/initramfs, version

Disabled all boot devices at the BIOS level, other than the disk. (Happens to be Intel 82801 SATA RAID Controller.)

Able to boot with Fedora 13 kernel/initramfs, and everything else Fedora 14.

Upgraded from Fedora 13 to Fedora 14 using preupgrade utility.

Any additional info needed?

Fedora End Of Life


2012-08-16 19:15:00 UTC

This message is a notice that Fedora 14 is now at end of life. Fedora 
has stopped maintaining and issuing updates for Fedora 14. It is 
Fedora's policy to close all bug reports from releases that are no 
longer maintained.  At this time, all open bugs with a Fedora 'version'
of '14' have been closed as WONTFIX.

(Please note: Our normal process is to give advanced warning of this 
occurring, but we forgot to do that. A thousand apologies.)

Package Maintainer: If you wish for this bug to remain open because you
plan to fix it in a currently maintained version, feel free to reopen 
this bug and simply change the 'version' to a later Fedora version.

Bug Reporter: Thank you for reporting this issue and we are sorry that 
we were unable to fix it before Fedora 14 reached end of life. If you 
would still like to see this bug fixed and are able to reproduce it 
against a later version of Fedora, you are encouraged to click on 
"Clone This Bug" (top right of this page) and open it against that 
version of Fedora.

Although we aim to fix as many bugs as possible during every release's 
lifetime, sometimes those efforts are overtaken by events.  Often a 
more recent Fedora release includes newer upstream software that fixes 
bugs or makes them obsolete.

The process we are following is described here: 

I read everything I could find about my question, but nothing helps :(

I tried to make a bootable ZFS system using two different SSDs for now, because I can't find a second SSD of the same size. I plan to buy two matching SSDs in the future, but for now I have only one.


OK, I rebooted and tried other info from here: https://forum.proxmox.com/threads/stuck-at-initramfs.56158/post-258736
"(initramfs) zpool import -R / rpool" (I see a path like / here and rpool as a parameter; let's try)
"The ZFS modules are not loaded."
"Try running '/sbin/modprobe zfs' as root to load them." (OK, let's do this)
"(initramfs) /sbin/modprobe zfs"
"(initramfs) zpool import -R / rpool"
"cannot mount '/': directory is not empty"

So the first one didn't help, and maybe the second is a typo? Anyway, I'm stuck here, please help :(
I believe I configured UEFI boot, because there is an "EFI Only" boot type in my BIOS.
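For the record, the "directory is not empty" message is about / inside the initramfs itself, which already contains files, so the pool can't be mounted there. The usual sequence in an initramfs shell is to import without mounting and let the boot continue; a hedged sketch (pool name rpool taken from the post above):

```
(initramfs) /sbin/modprobe zfs
(initramfs) zpool import -N rpool     # -N imports without mounting anything
(initramfs) zfs list                  # confirm rpool/ROOT/... datasets exist
(initramfs) exit                      # resume boot, if root= on the cmdline is correct
```

This only helps if the kernel command line actually carries a correct root=ZFS=... argument; as the rest of the thread shows, that argument was missing here.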

Stoiko Ivanov

If I understand you correctly you installed on one SSD (ZFS RAID0) and then tried attaching a second SSD as mirror?

* Did installing with one SSD work? — could you boot afterwards?
* If yes — which boot-loader was used by the installer? (For UEFI+ZFS, PVE uses systemd-boot, otherwise GRUB — you should see the difference when booting: GRUB has a blue screen, systemd-boot a black one.)

* Please paste/screenshot the output of `cat /proc/cmdline` when you reach the initramfs

Yes, I used ZFS RAID0 on one (the smallest) SSD first, installed Proxmox, and tried to boot from that single SSD.

1. The install seems OK; I cannot boot afterwards from the single SSD.
2. It looks like systemd-boot, because I didn't see any blue screens, and `cat /proc/cmdline` also points at the EFI loader.

3. Paste from `cat /proc/cmdline`:
initrd=\EFI\proxmox\5.3.10-1-pve\initrd.img-5.3.10-1-pve BOOT_IMAGE=/boot/linux26 ro ramdisk_size=16777216 rw quiet splash=silent

Stoiko Ivanov

Any messages during the installation? — did you run the installer in Debug mode?

Stoiko Ivanov

did you continue after taking the screenshot? — any messages in the final debug-shell before the reboot?

Yes, but nothing interesting: DHCP discover, reboot, etc.
Nothing like an error or strange behaviour.

Stoiko Ivanov

hmm — did you unplug the USB-key before booting into the freshly installed PVE?

I know I come across as a newbie, but I'm familiar with Linux and IT. I've been trying to fix this for about a whole day, so I'm feeling a bit confused and writing my answers like a newbie.

I just tried another physical disk for the installation; same error :(

Last edited: Jan 27, 2020

Stoiko Ivanov

In any case the kernel command line looks odd — I just installed a system locally with ZFS on root:

cat /proc/cmdline 
initrd=\EFI\proxmox\5.3.10-1-pve\initrd.img-5.3.10-1-pve root=ZFS=rpool/ROOT/pve-1 boot=zfs

This is how it looks when it works — the BOOT_IMAGE=/boot/linux26 you pasted looks wrong, and this might be the reason why it's not working for you.
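A quick way to compare is to pull the root= argument out of /proc/cmdline. Sketched here against the broken command line quoted earlier in the thread, used as a stand-in string since the affected machine never reaches a full shell:

```shell
# stand-in for:  cat /proc/cmdline  on the affected machine
cmdline='initrd=\EFI\proxmox\5.3.10-1-pve\initrd.img-5.3.10-1-pve BOOT_IMAGE=/boot/linux26 ro ramdisk_size=16777216 rw quiet splash=silent'
root_arg=$(printf '%s\n' "$cmdline" | tr ' ' '\n' | grep '^root=' || echo 'root= is MISSING')
echo "$root_arg"
```

On the working install above, the same check would print root=ZFS=rpool/ROOT/pve-1, matching the pasted output.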

I will reconfigure my test server, install PVE on live disks, and immediately return to you with an answer!
Anyway, many thanks for your support and fast answers :)

"Sorry — did not want to imply that you're unfamiliar with Linux!"
English is not my native language, so I think I'm the one who needs to say sorry! I didn't mean to make you apologize, man :)

Last edited: Jan 27, 2020

Nope, still no luck; I still can't get ZFS working. I thought it was a broken SSD, but I was wrong :(
I tried other disks, with only one disk connected, different combinations of BIOS SATA/RAID/EFI settings, etc.
I also tried to set up ZFS as the devs intended, i.e. installing on 2 similar disks; same error every time.

I'm starting to be sure that my motherboard (GA-Z68AP-D3 (rev. 1.0)) is buggy somewhere :(

Btw, I forgot to mention before that a standard install with the ext4 filesystem works without problems.

The disks differ by about 8 GB, but I don't think that's a problem. I can't init RAID 0 even on one disk, and I'm sure ZFS can do that.

My mobo supports 2 SATA3 slots, but through an additional Intel chip. So I can say that SATA3 on my board (where the SSDs are) is internal and external at the same time :D. Anyway, I tried connecting the SSDs to other slots; no luck :(


Today at work (and after work) I will try to check whether the USB stick is prepared correctly, using a VMware lab and also an old laptop.

Stoiko Ivanov

hmm — maybe there’s a BIOS-update available for the mainboard? (sometimes these do help with UEFI boot problems)

Already updated to the latest F8 BIOS; it didn't help.

Okay, I tested my USB stick in my VMware home lab; here is the info I got:
1. VMware can't boot from USB using the BIOS boot method, so I tested only UEFI boot.

2. When I started the installation, I noticed that PVE knows it booted from UEFI:
(screenshot: Аннотация 2020-01-28 191740.png)
When I tried to install PVE yesterday on my bare-metal machine, I definitely didn't see this message. I had selected EFI boot in the BIOS, but the installer used BIOS boot every time anyway.

3. Once it was installed, I could see these options on load:
(screenshot: Аннотация 2020-01-28 191822.png)
When I tried to install PVE yesterday, I could see only the first option.

4. PVE successfully installed on ZFS RAID0 using one virtual drive:
(screenshot: Аннотация 2020-01-28 194418.png)
So now I'm sure my USB stick is prepared correctly with Rufus, or at least in UEFI mode.

Next time I will try to install PVE on my old laptop and will give you new info, guys :)

Hello again!
So, here is what I've understood over these days:
1. My problem is definitely caused by my motherboard. It can't boot UEFI devices; my BIOS just doesn't support it.
2. Also, I can't install PVE on ZFS using the old BIOS boot method.
3. The problem is not in the USB device or the SSD disks.

OK, not a big problem; I will try another setup in the future, but I still have one main question:
I can't install PVE on ZFS using the old BIOS boot method. Is this intended, or is something going wrong with GRUB?
From different sources (wiki, forum, Google) I understand that I should have that option.

Or did I miss something, and the latest PVE on ZFS works only from UEFI?
If I'm right and I should be able to install PVE using BIOS boot, what could possibly be going wrong with my setup? Any new ideas?

Anyway, thank you for trying to help, guys, especially Stoiko ;)

Just to validate this, I'm having the same issue. I'm trying to install root ZFS RAIDZ-1 on 3x 120 GB SATA3 SSDs. The boot fails with "no root device found" and I'm thrown to BusyBox.

Gigabyte SKT-AM3 78LMT-USB3 / AMD 6300 6-core sh1theap / 32 GB DDR3 RAM / 3x SATA SanDisk 120 GB SSDs

(attachment: zfs-error.jpg)

Last edited: Feb 9, 2020

Even using a new MB+CPU+RAM, I can reproduce it with these steps:
1. Burn the USB key with Rufus using either of 2 methods: GPT+DD or MBR+DD
2. Choose "UEFI only (no CSM)" boot in the BIOS
3. Install PVE on ZFS RAID
4. It will try to boot with UEFI using systemd-boot (the black one) and then fail with the "no root device" error

I successfully installed my PVE on ZFS RAID only with these steps:
1. Burn the USB key with Rufus using the MBR+DD method with 2.02-pve GRUB
2. Choose "UEFI plus Legacy (CSM)" boot in the BIOS
3. Install PVE on ZFS RAID
4. Boot with the Legacy method from any of the ZFS RAID disks using GRUB (the blue one)
5. Works!
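A quick way to tell which mode a booted system actually used (a generic sketch, not PVE-specific tooling): the kernel exposes /sys/firmware/efi only when it was started via UEFI.

```shell
# Report boot mode; the path is a parameter so the check can be exercised anywhere.
boot_mode() {
  if [ -d "${1:-/sys/firmware/efi}" ]; then
    echo UEFI
  else
    echo Legacy
  fi
}
boot_mode                              # on a real system: UEFI or Legacy
boot_mode /nonexistent-path-for-demo   # prints "Legacy" for an absent path
```

Running this after step 4 above should report "Legacy" when the blue GRUB screen was used, and "UEFI" under systemd-boot.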

So, I still don't know what the problem was, but you can try this:
1. Update the BIOS to the latest version
2. Try burning the USB key with each method in turn: GPT+DD, MBR+DD+PVE GRUB 2.02-pve, MBR+DD+latest GRUB (Rufus will ask you about it)
3. Try to install PVE by booting from the USB key with both methods in turn: UEFI, Legacy
4. Try to boot into the freshly installed PVE with both methods in turn: UEFI, Legacy

Yeah, it looks like a stupid iteration over all the variables, and it isn't meant to fix the problem itself, but it can lead you to a working combination and maybe to new info that will help the community fix this error :)
I found a working combination; good luck :)
