2015-04-20

Note to Future Self: Random problems with anaconda installs after a new release

So I have run into the following problem several times in the past and never documented it so that I would remember it the next time it happens (I seem to do this about once every 3 years).

Story behind the problem:

Tonight I was starting to reinstall our test cloud system so that msuchy could run his playbooks against the top node and other systems. I have been doing this very regularly, and it is pretty much a rote procedure:

  1. Log into Dell iDrac on the cloud systems
  2. Fire up the remote console.
  3. Click on 'Next Boot' -> 'PXE'
  4. Click on 'Warm Reboot'
  5. Let the PXE menu come up and pick the mode for the particular type of hardware I am rebuilding.
  6. Let it run for 10 minutes. Move to the next system.
  7. .... ansible stuff goes here ....
Tonight as I was doing it, the install died with:



ValueError: new value non-existant xfs filesystem is not valid as a default fs type

I then used tmux (Control-B 2) to get to a shell and started looking at the log files in /tmp.

program.log has:

01:04:30,510 INFO program: Running... modprobe xfs

storage.log has [typed from remote console]:

01:04:30,509 DEBUG blivet: trying to set new default fstype to 'xfs'
01:03:30,510 DEBUG blivet:         XFS.supported: supported: True;
01:04:30,521 ERR blivet: Could not load kernel module xfs
01:04:30,521 DEBUG blivet: getFormat('xfs') returning XFS instance with object id 1
01:04:30,523 DEBUG blivet:                    XFS.supported: supported: False ;
01:04:30,521 DEBUG blivet: invalid default fstype: XFS instance (0x7fe6987362d0) object id 1 --
  type = xfs name = xfs status = False
  device = None uuid = None exists = None
  options = defaults supported = False formattable = True resizable = False
  mountpoint = None mountopts = None
  label = None size = 0 B targetSize = 0 B

01:04:30,525 DEBUG blivet:                    XFS.supported: supported: False ;
01:04:30,527 DEBUG blivet:                    XFS.supported: supported: False ;


I then tried a manual insmod of the xfs module and looked in dmesg:

[  760.625372] xfs: module verification failed: signature and/or required key missing - tainting kernel
[  760.625484] xfs: disagrees about version of symbol ftrace_raw_output_prep

Aha! That tells me that my kernels don't match. Since no one else is complaining about being unable to install with xfs on EL-7.1, that tells me I have a mismatched kernel somewhere.
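The mismatch can also be confirmed without waiting for an insmod failure, by comparing the running kernel against the module's vermagic string. A minimal sketch; the helper name and the version strings in the example call are made up for illustration, and in practice you would feed it `uname -r` and the first field of `modinfo -F vermagic xfs`:

```shell
# check_kver: report whether a kernel module was built for the running kernel.
check_kver() {
  running="$1"   # e.g. "$(uname -r)"
  modmagic="$2"  # e.g. "$(modinfo -F vermagic xfs | awk '{print $1}')"
  if [ "$running" = "$modmagic" ]; then
    echo "match: $running"
  else
    echo "MISMATCH: kernel $running vs module $modmagic"
  fi
}

# Made-up example versions, purely for illustration:
check_kver "3.10.0-229.el7.x86_64" "3.10.0-123.el7.x86_64"
```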

Now for a lot of our systems we use https://infrastructure.fedoraproject.org/infra/docs/kickstarts.txt and the grub-boot method to rebuild systems. But as mentioned above, we use PXE for the cloud systems, and when we updated to the new RHEL-7.1 images we forgot to update the PXE kernel items. Updating the vmlinuz and initrd.img on the PXE server fixed it. Tada! It all works.
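The fix amounts to copying the two boot files from the new install tree into the tftpboot directory. A sketch of that step as a small function; the function name is mine, and the paths in the commented example are assumptions, so substitute your own mirror and tftp root:

```shell
# update_pxe_images: copy the kernel and initrd from an install tree into
# the tftpboot directory so the PXE installer matches the new release.
update_pxe_images() {
  tree="$1"  # install tree, e.g. .../images/pxeboot
  tftp="$2"  # tftpboot target directory
  for f in vmlinuz initrd.img; do
    cp "$tree/$f" "$tftp/$f" || echo "failed to copy $f" >&2
  done
}

# Example (hypothetical paths):
# update_pxe_images /srv/mirror/rhel-7.1/images/pxeboot /var/lib/tftpboot/rhel7
```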

2015-04-10

FLOCK Rochester NY 2015: 3 things you should do...

This is cribbed from Ruth Suehle's email to announce and other lists:


  1. Register! Only 70 people have registered so far. What are you waiting for? https://register.flocktofedora.org/
  2. Submit a talk!  We've extended the deadline to May 2. Get those talk proposals in! https://register.flocktofedora.org/ 
  3. Reserve your hotel room! The deadline is July 16, but that doesn't mean you should wait until the last minute. It doesn't cost anything to reserve. https://resweb.passkey.com/go/FLOCK2015 

The FLOCK committee is exploring some really fun options for evening events right now, so Ruth hopes to be able to tell you about those soon. Keep an eye on the website, which in the next few days Ruth and others will be updating with information about Rochester.

2015-04-02

FLOCK 2015 is in Rochester NY

This is a short blog post to get over my writer's block. For the last 2 years, Fedora has had a computer festival called FLOCK in either Europe or North America. The first FLOCK happened in beautiful Charleston SC in 2013. The second FLOCK was in Prague, the wonderful capital of the Czech Republic. This year's FLOCK will be held from August 12-15 in Rochester, New York. The main website is having some issues (various links aren't pointing to the correct places because Wordpress is being obstinate), but these are being worked on as I write and hopefully will be fixed soon.

To register for the Rochester FLOCK, please go to https://register.flocktofedora.org/ and add your name and data to the list.

[Edited to fix a STUPID mistake on my part of somehow confusing Prague with Austria. Once again, mea culpa. I know better... and I should have edited before I hit send. I was reading a history of Prague, and there was a section on when it has been used in the past to stand in for ancient Vienna or other cities. PEBKAC error.]

2015-01-20

Mea Maxima Culpa

I would like to apologize for my last blog post. My original intention was to make an absurd point by proposing to drop 32 bit architectures from being primary in Fedora. I didn't communicate clearly that this was meant to be absurd. It also did not clearly state that the problem I am worried about is that, with many core developers only focusing on x86_64 and hardware that is less than 4 years old, people using x86_32 and ARM32 are in effect on borrowed time.

I made things worse by then trying to defend why I was making the absurd proposal in the first place. In doing so, I muddled things further and pissed off people versus making things better.


2015-01-19

Devil's Proposal: Moving Fedora to 64 bit only in Fedora 23

[Edited: 2015-01-22 This post was meant to be absurd in the vein of Jonathan Swift's "A Modest Proposal". Lesson learned: I am no Jonathan Swift. If I see a problem, I will write about it directly instead of as what only I saw as satire. Please see the next post for my formal apology to those I wronged.]

I am going to make the uncomfortable and ugly proposal to drop 32 bit in Fedora 23 and only treat 64 bit architectures as primary architectures. All 32 bit architectures (armv7hl, i386) would be moved to being secondary architectures that would require their own build teams and 'koji' to maintain builds in future releases. At the moment that would make the only 64 bit primary architecture x86_64, with arm64 and ppc64 possible candidates for mainstream support in F24 (if they aren't ready by Fedora 23).

The main reasons I am proposing this are:

  1. i386 yum usage and overall i386 usage of Fedora have not grown and have actually fallen slightly over the last 2 years. [Harder to show in this picture is that a larger number of the i386 installs are older releases no longer supported by Fedora]
  2. Multiple proposed changes in builder options for security work 'faster', 'better', or at all only in 64 bit.
  3. Development and developers have been focusing on 64 bit for quite a while, leading to 32 bit being an afterthought or a "I don't have a Pentium III to try and replicate your problem with." Many active developers and projects are looking to support only the hardware they have, which is 4 years old or newer.

I think those point to a picture where people who are using 32 bit architectures are not being well served by the distribution (and have not been for a while). Trying to force developers to focus on them usually ends up in a passive-aggressive relationship where things are only fixed after a long 'fight' of blame, versus someone who is interested in older hardware taking active charge of the problem.

Some people running 32 bit currently may feel that they are not capable of running a build system. The job at this point would be to look for the people capable and willing to do so. Maybe a Patreon or some similar program will be needed. I think it would be better to start an active community around 32 bit Linux, versus the alternative where Fedora X is 64 bit only and no avenues for 32 bit are left.

This may also mean that people with older hardware end up dropping Fedora altogether and going to Debian or Arch. I would actually say that the people doing so are being active and taking control of their destinies which is better than waiting for hand-scraps. If it means that Arch has a strong 32 bit presence and users of that hardware can get better support then the users win. 

2015-01-05

My nook died and with it my dreams of going all digital.

For Christmas 2010, I bought my wife a Nook Color. We had an E-ink Nook, and while I enjoyed reading books on it, there were enough colour illustrations in our books that I figured buying a colour Nook would be a nice present. I also thought it would be useful for my wife to quality-assurance test the various book formats of her books.

It turned out that my wife was perfectly happy with the e-ink one, and so I ended up with the Nook Color. This may sound like a standard husband joke (Hey honey, I know you have always wanted a bandsaw, so I got one for your Christmas present), but I really didn't think I needed an e-reader that much. However, I decided that if I have one, I might as well use it... which meant getting all of Project Gutenberg mirrored to a local computer and going through which epub books I wanted :). After reading through a couple of Icelandic sagas and various other books, I realized that ebooks saved a lot of space and I should go all-digital.

Out went a couple of bookshelves' worth of books, and in came their epub replacements. And this was great for my light reading, but then I got overly ambitious and traded out my O'Reilly books for their epub versions. O'Reilly books are some of the best technical manuals out there, but having an entire bookcase of them (with 3 versions of the sendmail book, 2 copies of the Unix admin book, and various versions of the sed and awk book) seemed to be a waste of space.

And so, a bunch of purchases later, nearly everything went out the door and my disk drives got various versions of them. Only later did I remember that some of those books had been signed copies ("Hey, guess what I found at the library swap meet? A signed copy of the 1st edition sendmail book!" Ohhh, that's where that went.)

And then I realized a very big problem: trying to read a technical book on a small 7 inch tablet does not work for me. Between the screen dimming every couple of minutes while I pore over a page and try to figure out what I need to do next on the computer, and constantly turning the tablet sideways to read text that is too wide for the screen... I found the tablet didn't work well. Then I tried my other favourite medium, comics, and again found the size factor too small to be useful. And while I enjoy reading books on e-ink, the text on a screen just seemed to add up to headaches after a while (or I would end up sitting in a place with too much sun and the screen would be useless).

The final straw was that I missed the smell and feel of the books. Leafing through a book has a visceral quality in the same way that listening to vinyl has for some people (those who don't have tinnitus from sitting in various computer machine rooms over the last 24 years). In the end, the nook was used less and less, even when I travelled. If I want to read something, I actually use a reader on my phone, which I like much more than the limited settings of the Nook. If I need to look at a manual, I use it on the computer or on a second screen connected to it (or I pull it from the shelf and sit in the sun and read it for a couple of hours).

In the end, the nook ended up in a drawer, where I found it last week. Its lithium battery was dead and wouldn't hold a charge, and the prospect of buying a $70.00 special toolkit+battery to fix it wasn't really all that appealing. So it is off to be donated to some organization which takes these boxes and either fixes them or sends them to get recycled.

I may end up looking at a 10 inch tablet tool someday, but for now I would just like a very large colour eink reader :)

How I spent my Winter Vacation

So the Winter of 2015 starts at the end of 2014, and my current employer, Red Hat, likes to have a mandatory close-down of about a week at the end of the year. A long, long time ago (say around 1999? or 1998?), we started this tradition for a couple of reasons:


  1. It saved money when money was tight. In 1999 we were a small company of less than 200 people. Most of our 'sales' were in the retail industry: various big-box companies would buy a ton of boxed sets and then return the unsold ones for a refund. This meant that cash flow was rather tight at the end of the year, when returns of Red Hat Linux 5.1 might show up from Best Buy or CompUSA.
  2. Not a lot could get accomplished at the company, as enough people would be on vacation at the end of the year that you ended up in spinlocks waiting for person XYZ to return. Having various business decisions stopped for two weeks while people sat at their desks was a waste of everyone's time.
  3. Another reason given had to do with US tax laws and vacation time, but I don't know how accurate it is. Vacation time, whether it is taken or not, is taxable, so having people make sure they use it meant that the company was being a good custodian of investors' money.
Of course the end of 1999 was a fun year, as we had the Y2K problem to deal with and various contracts with IBM and SGI for round-the-clock support in case Linux systems failed. So the support people, who had the most unused vacation time, ended up not using it during the break. I ended up taking all my vacation time afterwards, coming back to work in late February, I think.

Anyway, inaccurate history aside, I have usually used the last 2 weeks of December to take some time to spend with the family, get the last-minute Christmas cards and shopping done, and then try to get some sort of big project started for the New Year. In the end, I usually come down with a cold or something which sidelines me. This year I got the cold, but did manage to get all the various projects done (even if the cards only got out the door just before Orthodox Christmas on Jan 7). My project for the holidays was to go through 24 years of backups and see what I really wanted to keep.

Why was I keeping around various source code from GNU projects in 1993, which I doubt would either build or be useful? :) And why didn't I keep the source code for the project I was actually looking for, dominion (a Conquer clone written by Mark Galassi, et al.)? I did find I had a ton of dot files which I have basically mangled since 1989 in one form or another. I decided to take the various snapshots in the backups and check them in historically, using a set of scripts to time-stamp the files, md5sum them, remove duplicates (e.g. if the md5sum was the same, only check in one version), and then a script which basically copied each file into git and moved on to the next one.

I doubt the code is useful to anyone, but for my own memory....


export TARGET="_bashrc"
for i in $(awk '{print $3}' ~/file_list ); do
  /bin/cp "$i" "${TARGET}"
  git add "${TARGET}"                     # needed on the first pass, while the file is still untracked
  MESSAGE=$( grep "${i}$" ~/file_list )   # matching file_list line: date, md5sum, original path
  git commit -m "$MESSAGE"
  echo "========"
done

file_list has contents like:

2014-07-16 17b45ad44d7734ecfa11bc2bb4ce7171 /srv/backups/Laptop03/root/.bashrc
2014-09-25 2f8222b4f275c4f18e69c34f66d2631b /srv/backups/Laptop03/etc/skel/.bashrc
2014-10-28 e078c68f866127d265a84fa8397b9828 /srv/backups/Laptop03/home/smooge/.bashrc
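The file_list itself can be produced with a short pipeline along the lines described above. A sketch, assuming GNU find/date/md5sum; the function name and the hardcoded '.bashrc' filter are mine. Sorting first means the dedup keeps the oldest copy of each checksum:

```shell
# build_file_list: emit "date md5sum path" for every .bashrc under a tree,
# keeping only the first (oldest-dated) file seen for each checksum.
build_file_list() {
  find "$1" -type f -name '.bashrc' | while read -r f; do
    printf '%s %s %s\n' \
      "$(date -r "$f" +%F)" \
      "$(md5sum "$f" | awk '{print $1}')" \
      "$f"
  done | sort | awk '!seen[$2]++'
}

# build_file_list /srv/backups > ~/file_list
```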

When you do this sort of thing, make sure that you actually look at the files before you check them in. I almost checked in a couple of files which had passwords in them. While I don't use those passwords anymore... I sure made horrible choices in passwords: '7son@7So', because DEScrypt stops at 8 characters (not that the 9th character 'n' would have made the password any more secure).
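A cheap way to catch the obvious cases before committing is to grep for likely secret markers. This is only a heuristic (the pattern list below is my own guess at what to look for), so any hits still need a human eye:

```shell
# scan_for_secrets: flag lines that look like they might contain credentials.
# Exits 0 (and prints the matches) when anything suspicious is found.
# The pattern list is a rough heuristic, not a guarantee.
scan_for_secrets() {
  # trailing /dev/null forces grep to prefix each hit with its filename
  grep -nEi 'passw(or)?d|secret|api[_-]?key' "$@" /dev/null
}

# scan_for_secrets _bashrc  # review any output before 'git commit'
```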

Anyway, I had a great vacation (even with the cold and a back pulled from bowling) and will work on making 2015 great too.