2015-01-20

Mea Maxima Culpa

I would like to apologize for my last blog post. My original intention was to make an absurd point by proposing to drop 32 bit architectures from primary status in Fedora. I did not communicate clearly that this was meant to be absurd. I also did not clearly state the problem I am actually worried about: with many core developers focusing only on x86_64 and hardware less than 4 years old, people using x86_32 and 32 bit ARM are in effect on borrowed time.

I made things worse by then trying to defend why I was making the absurd proposal in the first place. In doing so, I muddled things further and pissed off people instead of making things better.


2015-01-19

Devil's Proposal: Moving Fedora to 64 bit only in Fedora 23

[Edited 2015-01-22: This post was meant to be absurd, in the vein of Jonathan Swift's "A Modest Proposal". Lesson learned: I am no Jonathan Swift. If I see a problem, I will write about it directly instead of as what only I saw as satire. Please see the next post for my formal apology to those I wronged.]

I am going to make the uncomfortable and ugly proposal to drop 32 bit in Fedora 23 and treat only 64 bit architectures as primary architectures. All 32 bit architectures (armv7hl, i386) would be moved to secondary architectures, which would require their own build teams and koji instances to maintain builds in future releases. At the moment that would make x86_64 the only 64 bit primary architecture, with arm64 and ppc64 as possible candidates for mainstream support in Fedora 24 (if they are not ready by Fedora 23).

The main reasons I am proposing this are:

  1. i386 yum usage, and usage of Fedora on i386 overall, has not grown and has actually fallen slightly over the last 2 years. [Harder to show in this picture is that a larger share of the i386 installs are older releases no longer supported by Fedora.]
  2. Multiple proposed changes in builder options for security work 'faster', 'better', or at all only on 64 bit.
  3. Development and developers have been focusing on 64 bit for quite a while, leaving 32 bit an afterthought or a "I don't have a Pentium III to try and replicate your problem with." Many active developers and projects are looking to support only the 4-year-old (or newer) hardware that they have.
I think those point to a picture where people using 32 bit architectures are not being well served by the distribution (and have not been for a while). Trying to force developers to focus on them usually ends up in a passive-aggressive relationship where things are only fixed after a long 'fight' of blame, versus someone who is interested in older hardware taking active charge of the problem.

Some people running 32 bit currently may feel that they are not capable of running a build system. The job at this point would be to look for the people who are capable and willing to do so. Maybe a Patreon or some similar program will be needed. I think it is better to start an active community around 32 bit Linux now than to wait for the alternative where Fedora X is 64 bit only and no avenues for 32 bit are left.

This may also mean that people with older hardware end up dropping Fedora altogether and going to Debian or Arch. I would actually say that the people doing so are being active and taking control of their destinies, which is better than waiting for hand-scraps. If it means that Arch ends up with a strong 32 bit presence and users of that hardware get better support, then the users win.

2015-01-05

My nook died and with it my dreams of going all digital.

For Christmas 2010, I bought my wife a Nook Color. We had an e-ink Nook, and while I enjoyed reading books on it, there were enough colour illustrations in our books that I figured a colour Nook would be a nice present. I also thought it would be useful for my wife to quality-assurance test the various formats of her own books.

It turned out that my wife was perfectly happy with the e-ink one, and so I ended up with the Nook Color. This may sound like a standard husband joke ("Hey honey, I know you have always wanted a bandsaw, so I got one for your Christmas present"), but I really didn't think I needed an e-reader that much. However, I decided that if I had one, I might as well use it... which meant mirroring all of Project Gutenberg to a local computer and going through which epub books I wanted :). After reading a couple of Icelandic sagas and various other books, I realized that ebooks saved a lot of space and I should go all-digital.

Out went a couple of bookshelves worth of books, and in came their epub replacements. This was great for my light reading, but then I got overly ambitious and traded out my O'Reilly books for their epub versions. O'Reilly books are some of the best technical manuals out there, but having an entire bookcase of them (with 3 versions of the sendmail book, 2 copies of the Unix admin book, and various editions of the sed and awk book) seemed to be a waste of space.

And so, a bunch of purchases later, nearly everything went out the door and my disk drives got various versions of them. Only later did I remember that some of those books had been signed copies. ("Hey, guess what I found at the library swap meet? A signed copy of the 1st edition sendmail book!" Ohhh, that's where that went.)

And then I realized a very big problem: trying to read a technical book on a small 7 inch tablet does not work for me. Between the screen dimming every couple of minutes while I pore over a page and try to figure out what I need to do next on the computer, and constantly turning the tablet sideways to read text that is too long for the screen... I found the tablet didn't work well. Then I tried my other favourite medium, comics, and again found the size too small to be useful. And while I enjoy reading books on e-ink, the text on a screen just seemed to add up to headaches after a while (or I would end up sitting in a place with too much sun and the screen would be useless).

The final straw was that I missed the smell and feel of the books. Leafing through a book has a visceral quality, in the same way that listening to vinyl does for some people (those who don't have tinnitus from sitting in various computer machine rooms over the last 24 years). In the end, the Nook was used less and less, even when I travelled. If I want to read something, I actually use a reader on my phone, which I like much more than the limited settings of the Nook. If I need to look at a manual, I use it on the computer or on a second screen connected to it (or I pull it from the shelf and sit in the sun and read it for a couple of hours).

In the end, the Nook ended up in a drawer, where I found it last week. Its lithium battery was dead and wouldn't hold a charge, and the prospect of buying a $70.00 special toolkit+battery to fix it wasn't really all that appealing. So it is off to be donated to some organization which takes these boxes and either fixes them or sends them to be recycled.

I may end up looking at a 10 inch tablet tool someday, but for now I would just like a very large colour eink reader :)

How I spent my Winter Vacation

So the Winter of 2015 starts at the end of 2014, and my current employer, Red Hat, likes to have a mandatory close-down of about a week at the end of the year. A long, long time ago (say around 1998 or 1999?), we started this tradition for a couple of reasons:


  1. It saved money when money was tight. In 1999 we were a small company of less than 200 people. Most of our 'sales' were to various big-box retail companies who would buy a ton of boxed sets and then return the unsold ones for a refund. This meant that cash flow was rather tight at the end of the year, when returns of Red Hat Linux 5.1 might show up from Best Buy or CompUSA.
  2. Not a lot could get accomplished at the company, as enough people would be on vacation at the end of the year that you ended up in spinlocks waiting for person XYZ to return. Having various business decisions stopped for two weeks while people sat at their desks was a waste of everyone's time.
  3. Another reason given was US tax law and vacation time, but I don't know how accurate that is. Vacation time, whether it is taken or not, is taxable, so making sure people used it meant that the company was being a good custodian of investors' money.
Of course the end of 1999 was a fun one, as we had the Y2K problem to deal with and various contracts with IBM and SGI for round-the-clock support in case Linux systems failed. So the support people, who had the most unused vacation time, ended up not using it during the break. I ended up taking all my vacation time afterwards, coming back to work in late February, I think.

Anyway, inaccurate history aside, I have usually used the last 2 weeks of December to take some time to spend with the family, get the last-minute Christmas cards and shopping done, and then try to get some sort of big project started for the New Year. In the end, I usually come down with a cold or something which sidelines me. This year I got the cold, but did manage to get all the various projects done (even if the cards only got out the door just before Orthodox Christmas on Jan 7). My project for the holidays was to go through 24 years of backups and see what I really wanted to keep.

Why was I keeping around various source code from GNU projects in 1993 which I doubt would either build or be useful :). And why didn't I keep the source code for the project I was actually looking for: dominion (a Conquer clone written by Mark Galassi, et al)? I did find I had a ton of dot files which I have basically mangled since 1989 in one form or another. I decided to take the various snapshots in the backups and check them in historically, using a set of scripts to time-stamp the files, md5sum them, remove duplicates (e.g. if the md5sum was the same, only check in one version), and then a script which basically copied a file into git and moved on to the next one.

I doubt the code is useful to anyone, but for my own memory....


export TARGET="_bashrc"
for i in $(awk '{print $3}' ~/file_list); do
  /bin/cp "$i" "${TARGET}"
  # use the matching file_list line (date, md5sum, path) as the commit message
  MESSAGE=$(grep "${i}$" ~/file_list)
  git add "${TARGET}"
  git commit -m "$MESSAGE"
  echo "========"
done

file_list has contents like:

2014-07-16 17b45ad44d7734ecfa11bc2bb4ce7171 /srv/backups/Laptop03/root/.bashrc
2014-09-25 2f8222b4f275c4f18e69c34f66d2631b /srv/backups/Laptop03/etc/skel/.bashrc
2014-10-28 e078c68f866127d265a84fa8397b9828 /srv/backups/Laptop03/home/smooge/.bashrc
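
The file_list itself came out of the time-stamp/md5sum/dedup pass described above. Something along these lines could produce it (a sketch of the approach, not my actual script; the `make_file_list` name is made up here):

```shell
#!/bin/sh
# Sketch: build a "date md5sum path" list of every copy of a dot file
# found under a backup tree, keeping only the first copy of each
# distinct checksum (the duplicate-removal step described above).
make_file_list() {
  root="$1"       # e.g. /srv/backups
  name="$2"       # e.g. .bashrc
  find "$root" -name "$name" -type f | while read -r f; do
    stamp=$(date -r "$f" +%Y-%m-%d)            # file modification date
    sum=$(md5sum "$f" | awk '{print $1}')      # checksum used for dedup
    printf '%s %s %s\n' "$stamp" "$sum" "$f"
  done | sort | awk '!seen[$2]++'              # keep first of each md5sum
}
# e.g.: make_file_list /srv/backups .bashrc > ~/file_list
```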

When you do this sort of thing, make sure that you only check in files which you have looked at first. I almost checked in a couple of files which had passwords in them. While I don't use those passwords anymore... I sure made horrible choices in passwords: '7son@7So', because DEScrypt stops at 8 characters (not that the 9th character 'n' would have made the password any more secure).
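
A crude way to do that "look first" pass (my own after-the-fact sketch; the `flag_secrets` helper and the patterns are made up, and a real secret scanner would do better):

```shell
#!/bin/sh
# Sketch: print any file from the list (3rd column: the path) that
# contains password-ish strings, so it gets a human review before
# being fed to the git check-in loop above.
flag_secrets() {
  awk '{print $3}' "$1" \
    | xargs grep -l -i -E 'passw(or)?d|secret|PRIVATE KEY' 2>/dev/null
}
# e.g.: flag_secrets ~/file_list
```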

Anyway, I had a great vacation (even with the cold and the back I pulled while bowling) and will work on making 2015 great too.

Welcome to 2015. Help Make it a better year

Hi everyone,

I am going to try to do at least one blog post every week to cover what is going on in Fedora Infrastructure (or at least my part of it), Extra Packages for Enterprise Linux (EPEL), Software Collections for Enterprise Linux (SCEL), and other areas of Free, Libre, and Open Source Software (FLOSS).

On the first of January, I saw a tweet from someone whose attribution I have lost. The tweet basically said, "Instead of wishing someone a happy New Year, ask them to make it a happy New Year. Acknowledge that they have control of their future." It was one of those statements that hit me, and made me realize that I can't just wish for a better future, but I can make it so. [I was going to say try, but remembered Yoda: "Try not. Do... or do not. There is no try."] So this year I am going to make 2015 better in whatever way I can, and I hope you, reader, will do so too.


2014-10-24

How to know what generation you were born in.

So in 2011, I wrote an article about 'the generations'. Not much has changed except that the current buzzword in marketing is the 'millennial'. The millennials are doing this, the millennials are doing that...

Before anyone starts wondering if this is going to be a 'get off my lawn' article: I have no problem with this. The Generation X thing was well past its expiration date, what with even the youngest Generation X person now being in their mid-30's. [And thus too old to be marketed to by most magazines and online journalists.]

My main problem is with where people define the cut-off for being a 'millennial'. Depending on the article, it ranges anywhere from being born in 1980 to 2009. [I wonder if the discrepancy is to make sure the author of the piece is still in the 'young and hip' demographic versus being an old flake of the last generation.]

OK, for any person wanting to know what generation they or some ancestor 'belonged to', here is a handy reference guide. Like my original article, it is full of poop, because it is EuroAmerican and doesn't count the 10,000's of generations that various American Indian tribes had before someone tried to find a shortcut to China by going the long way around the world. [It also doesn't count the Norse colonies from the 900's or so.] Instead it uses the one 'defined' generation, the Baby Boomers, as the starting point to count backwards and forwards from, with X as the generation after that. It is cool that Generation A would have been some of the first 'Americans'.


Generation:  A 1550 -> 1567 (New Mexico 'colonies')
Generation:  B 1568 -> 1585 (St Augustine Fl and first known birth)
Generation:  C 1586 -> 1603 (Roanoke Island)
Generation:  D 1604 -> 1621 (Jamestown)
Generation:  E 1622 -> 1639
Generation:  F 1640 -> 1657
Generation:  G 1658 -> 1675
Generation:  H 1676 -> 1693
Generation:  I 1694 -> 1711
Generation:  J 1712 -> 1729
Generation:  K 1730 -> 1747 (The Founding Parents)
Generation:  L 1748 -> 1765 (The Revolution Fighters)
Generation:  M 1766 -> 1783 (The Last Colonials)
Generation:  N 1784 -> 1801 (The War of 1812 Generation)
Generation:  O 1802 -> 1819
Generation:  P 1820 -> 1837
Generation:  Q 1838 -> 1855 (The Civil War Generation)
Generation:  R 1856 -> 1873
Generation:  S 1874 -> 1891 (Greater Collapse of 1892 gen)
Generation:  T 1892 -> 1909 (Lost Generation of WWI)
Generation:  U 1910 -> 1927 (Greatest Generation of WWII)
Generation:  V 1928 -> 1945 (Silent Generation)
Generation:  W 1946 -> 1963 (Baby Boomers)
Generation:  X 1964 -> 1981 (Generation X)
Generation:  Y 1982 -> 1999 (Millenials)
Generation:  Z 2000 -> 2017 (the "not the last generation")
Generation: AA 2018 -> 2035 (rebuilders from the Unix Apocalypse)
Generation: AB 2036 -> 2063
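
Since the table is just an 18-year stride starting at 1550, the letter for any birth year can be computed rather than looked up. A quick sketch (the `generation` function name is my own):

```shell
#!/bin/sh
# Sketch: map a birth year to its generation letter, assuming the
# fixed 18-year generations starting at 1550 from the table above.
generation() {
  idx=$(( ($1 - 1550) / 18 ))    # 0 = A, 1 = B, ... 26 = AA, ...
  letters="ABCDEFGHIJKLMNOPQRSTUVWXYZ"
  second=$(echo "$letters" | cut -c $(( idx % 26 + 1 )))
  if [ $(( idx / 26 )) -eq 0 ]; then
    echo "$second"               # single letter: A..Z
  else
    echo "A$second"              # double letter: AA, AB, ...
  fi
}
generation 1964   # X  (Generation X)
generation 2020   # AA (the rebuilders)
```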

So I hope this is useful. [I probably need an app for this... just to make sure the millennials get it... crap, some kids are on my lawn again... HEY YOU!!!!]

2014-09-19

How to work with CentOS-5 in a CentOS-7 mock shell

So I have been spending a lot of time lately working on Extra Packages for Enterprise Linux (EPEL) as I know it is one of the undersold success stories of Fedora.

In doing so, I have been focusing on EPEL-5, as it is the oldest release and something most packagers do not actually think about (as they are usually focusing on the latest and greatest in Fedora, or maybe Enterprise Linux 7). This has been a trip down memory lane, as I have had to deal with things like ancient Python and a yum without the same command set or tools as 'current' ones.

One of the big things I have to remember is that EL-5 is based off Fedora Core 6 (Zod), and so its RPM database is in a different format than that of any Fedora after version 8 (I believe). I rediscovered this format change when I was trying to see which packages in EPEL might replace 'core' packages in a bare-bones CentOS-5 install. I was using mock to do this, which uses the host system (in my case an EL-7 box) to populate the buildroot with packages from the EL-5 tree.

I ran into this when I did:
[smooget@junk02 rpm]$ mock -r local-5-i386 --init
[smooget@junk02 rpm]$ mock -r local-5-i386 --shell
[root@junk02 /]# rpm -qa


I got an error telling me that the database type (9) was unknown. After spending some time googling various people who had run into this, I was able to piece together the following procedure for working with rpm inside the mock shell:
  1. Before entering the shell, dump the databases with the host system's db_dump:
    [smooget@junk02 rpm]$ mock -r local-5-i386 --install db4-utils
    [smooget@junk02 rpm]$ sudo -i
    [root@junk02 rpm]# cd /srv/mock/tree/local-5-i386/root/var/lib/rpm # note: this is not the default location
    [root@junk02 rpm]# for i in Basenames Conflictname Dirnames Group Installtid Name Obsoletename Packages Providename Requirename Sha1header Sigmd5 Triggername ; do echo $i; db_dump $i > $i.x; done
    Basenames
    Conflictname
    Dirnames
    Group
    Installtid
    Name
    Obsoletename
    Packages
    Providename
    Requirename
    Sha1header
    Sigmd5
    Triggername
    
  2. Now we can load all those databases (really only Packages is needed, but I like to be a completist):
    [smooget@junk02 rpm]$  mock -r local-5-i386 --shell
    [root@junk02 /]# cd /var/lib/rpm
    [root@junk02 /]# for i in Basenames Conflictname Dirnames Group Installtid Name Obsoletename Packages Providename Requirename Sha1header Sigmd5 Triggername; do echo $i; rm -v $i; cat $i.x | db_load $i; done
    [root@junk02 /]# rpm --rebuilddb
    [root@junk02 /]# rpm -qa 
    should give you a list of packages.
    
Note that once you have done this you cannot use mock to install packages anymore. If you need to install more packages, make sure you have installed yum before you do this.

So what does this give you, once you have completed this and installed EVERY possible package (choosing one set in the case of conflicts like samba/samba3)?




=============================================================================================================================
 Package                          Arch                    Version                            Repository                 Size
=============================================================================================================================
Updating:
 agg                              i386                    2.5-9.el5                          extras                    147 k
 agg-devel                        i386                    2.5-9.el5                          extras                    368 k
 fribidi                          i386                    0.19.2-2.el5                       extras                     53 k
 fribidi-devel                    i386                    0.19.2-2.el5                       extras                     53 k

Transaction Summary
=============================================================================================================================
Install       0 Package(s)
Upgrade       4 Package(s)

Total size: 620 k
Is this ok [y/N]: 



So it looks like my next step will be to (a) see if RHEL 5.10 updated those packages and, if not, have them 'removed' or something else. I also need to figure out a better way of doing this, so we can have a koji test to make sure such an EPEL package never gets in in the first place.