Playing with #fibrechannel a little more this evening.

Fedora spins are nice. Some of the live ISOs boot just fine over the network. I couldn't get the SoaS version working right, but it did boot.

I also tried every screenshot tool on COSMIC until finally grimshot did the job.

The Plasma Mobile environment is very much like a mobile OS. It would be quite nice on a Surface Pro. But it's a real Linux, terminal and all. There are a bunch of apps I've never heard of. I'll have to do an install.

I made very little progress this weekend with #fibrechannel.

There aren't that many distros that can boot the installer over fibre channel. Once I figure out what I want to host, I can at least try booting over USB and installing to the remote disks.

I did pull the 500 GB NVMe drive from my VisionFive 2 and install it in my target machine on a PCIe adapter card to host the disk images. The SATA drive was starting to hang, like it was wearing out. This should help.

I bought two OM4 cables too.

Today's adventures in #fibrechannel

I installed several flavors of Linux on my homemade SAN (if you can even call it that).

I was also able to get FreeBSD installed to the point where it loads the bootloader over the network, but when it starts the kernel, the entire computer restarts.

I'm not sure what I'm missing.

Today's results on #fibrechannel

I have installed three operating systems on fibrechannel-connected storage. Attaching fetches from each.

Conceptually this technology is simple and when it works, it actually solves some problems in my tiny homelab.

Today I went to the store to purchase a USB 3.1 flash drive for these tests and had some success. At some point I had a flash of insight and mapped the installation image file to a LUN.
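For anyone curious, the LUN trick is roughly this on a Linux LIO target (a sketch, not a recipe; it assumes targetcli with the qla2xxx target driver already set up, and the file path and WWN below are placeholders):

```shell
# Sketch: expose an installer ISO as a LUN via LIO's targetcli.
# The path and target WWN are placeholders, not my actual setup.
targetcli /backstores/fileio create name=installer \
    file_or_dev=/srv/iso/installer.iso
targetcli /qla2xxx/<target-wwn>/luns create /backstores/fileio/installer
targetcli saveconfig
```

From the initiator side the ISO then shows up as just another disk, which is why the HBA BIOS can boot from it.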

I spent a couple of hours this afternoon installing Ubuntu to a #fibrechannel connected drive, but I couldn't get it to boot.

I've selected the correct LUN in the card's BIOS settings and set the machine to boot from the HBA. The only thing I can figure is that it has something to do with EFI.

I know the zones are set correctly because I can see the remote storage during installation, and both Ubuntu desktop and server went through the normal setup process.

I must be missing something.
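If the EFI theory is right, these are the things I'd check next from a live environment, chrooted into the installed system. This is a checklist sketch assuming a standard Ubuntu EFI layout, not a known fix:

```shell
# Checklist, run from a chroot into the installed system:
efibootmgr -v                 # is there an NVRAM boot entry at all, and does
                              # it point at the FC disk's ESP?
lsblk -o NAME,PARTTYPE,MOUNTPOINT   # is the ESP on the LUN, mounted at /boot/efi?
# A half-registered entry can sometimes be fixed by re-running:
grub-install --target=x86_64-efi --efi-directory=/boot/efi
```

The catch is that the installer writes the NVRAM entry against the disk as it sees it, which may not match how the HBA BIOS presents the LUN at boot.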

I'm almost certain removing the PCIe video card from my Dell T3500 will allow the #fibrechannel card to get enough IRQs or bus resources to work.

But this is a Xeon without an onboard GPU.

The motherboard has two PCI slots, and I dug up a Matrox Millennium card I've had for about 25 years.

Support for the card was removed from Linux a while back, but Linux doesn't need the card; only the BIOS does.

So I'm already two assumptions deep and I haven't even done anything.

What are the odds this will work?

Though it's admittedly a little niche, I'm surprised there's no third-party Linux available for the Brocade #fibrechannel switches. #FabricOS is itself Linux for PPC, so I know it's possible.

Today's adventures with #fibrechannel switching. The electronics store had an RS-232 cable, which worked. I've logged in with the factory-default root password.

The management Ethernet port has an address on my subnet now.

I can't log in over SSH because the switch's SSH implementation is too old to negotiate a key exchange with my client. From the switch side, you're supposed to fetch the public key using scp, which is also too old to connect to my SSH server.

I suppose I could put Debian Wheezy on a VM. But it's probably more secure to keep using serial.
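If I ever do want SSH instead of serial, OpenSSH can re-enable legacy algorithms for a single host. Something like this in ~/.ssh/config might work (a sketch; the host alias and address are placeholders, and the exact algorithm names depend on what the switch actually offers):

```
# ~/.ssh/config -- legacy-crypto exception scoped to one old switch only.
Host brocade-switch
    HostName 192.0.2.10                         # placeholder management IP
    KexAlgorithms +diffie-hellman-group1-sha1
    HostKeyAlgorithms +ssh-rsa
    PubkeyAcceptedAlgorithms +ssh-rsa
    Ciphers +aes128-cbc,3des-cbc
```

Scoping it to one Host block keeps the weak algorithms from leaking into every other connection.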

The #qlogic #fibrechannel HBA arrived and I finally have a peer-to-peer connection between two Linux PCs.

The connection runs at 8 Gb/s, so it's faster than the SATA link the target drive is attached to.
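The kernel reports the negotiated link state under sysfs, which is how I checked the speed. A quick loop like this prints each FC host's speed and port state, and harmlessly prints nothing on a machine with no FC hardware:

```shell
#!/bin/sh
# Print the negotiated speed and port state of each fibre channel HBA,
# using the standard Linux fc_host sysfs attributes.
for h in /sys/class/fc_host/host*; do
    [ -d "$h" ] || continue   # no FC HBAs present: print nothing
    printf '%s: %s (%s)\n' "$(basename "$h")" \
        "$(cat "$h/speed")" "$(cat "$h/port_state")"
done
```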

What's kind of frustrating is finding instructions on some random forum rather than on Ubuntu's website (Ubuntu is the OS of the target machine). Red Hat has some details, but some of it is behind a login.

It looks like the #qlogic #fibrechannel boards do target mode on Linux.
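From what I've read, flipping the qla2xxx driver into target mode comes down to a module option plus an initramfs rebuild, with LIO then claiming the ports. A sketch, assuming that setup:

```
# /etc/modprobe.d/qla2xxx.conf -- disable initiator mode so LIO can
# drive the ports in target mode (rebuild the initramfs afterwards).
options qla2xxx qlini_mode=disabled
```

I haven't verified this on the boards I'm ordering, so consider it a starting point rather than a known-good config.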

So I'll order some and see how it goes.

I don't know why I'm obsessed with this all of a sudden, but I need to get this working.