One of the few things I miss from my MS-DOS years is WordPerfect 5.1. It was a perfect application for writing, and few WYSIWYG word processors, if any, have been as user-friendly as WP was. For the last seven or eight years I have been constantly looking for a perfect word processor. I have used KWord, AbiWord, OpenOffice, LyX and even Emacs for writing. Sometimes I have planned to install a copy of WordPerfect under FreeDOS just to recreate that simple desktop with no distractions like Firefox. But there were, of course, things like Tetris and Civilization that sometimes kept me from writing for days.
Today I decided to test yet another Linux word processor -- WordGrinder is probably the one and only console-mode word processor for Linux. It is as simple as it gets, but not too simple. WordGrinder uses a binary format for its files, which makes them unreadable in ordinary text editors. It can, however, export files to plain text and HTML. The latest version (0.3) can even export to LaTeX and troff for the geekiest users. Unfortunately, my Debian still carries version 0.2, which does not support all the fancy stuff mentioned on the WordGrinder website.
Lightweight Linux is a blog about using Linux on old computers. Lightweight distributions and applications bring your old hardware back to life!
Slackware 13.0 Released!
Slackware is the oldest surviving Linux distribution. Now Patrick Volkerding has released version 13.0 of this distribution, which aims to keep things simple.
The biggest news in the release announcement is the addition of an official 64-bit port. Of course, this should not be very exciting for the readers of Lightweight Linux, as we are happy to continue using our 32-bit hardware. In addition to KDE 4.2.4, Slackware 13.0 provides Xfce 4.6.1. And of course, you can always choose to install one of the lightweight window managers if you intend to use Slack on an old computer.
WolframAlpha is a Computational Knowledge Engine
Just a short note today.
WolframAlpha is a computational knowledge engine that seems to be able to compute an answer to many of my problems. For example, it can compare stocks and do all kinds of fancy math.
Frankenstein's Laptop
As you are reading this blog, I believe you have at least one or two spare computers you cannot find any real use for at the moment. If you are not interested in following in K. Mandla's footsteps and using Crux and Awesome on an ancient 120 MHz / 16 MB laptop, you might try this "Frankenstein hack" and build a cool slim desktop out of your old laptop!
Or maybe you could combine the two ideas?
Beginning the Linux Command Line
I finally got my hands on a copy of Sander van Vugt's Beginning the Linux Command Line. It really is a book I can warmly recommend to those of my readers who want to learn the basics of administering a Linux system using the command line only. This is, of course, a necessary skill for anyone who intends to use Linux on an old computer that cannot comfortably run the modern desktop distributions with their point-and-click setup tools for almost every imaginable configuration option.
The author, Sander van Vugt, is a consultant specializing in Linux high availability, storage solutions, and performance problems. This is not his first book: he has published several books on Linux-related subjects before Beginning the Linux Command Line. In addition to his books about Linux servers, he has written articles for several websites and magazines such as Linux Journal and Linux Magazine.
According to the author, the book was written for anyone who wants to master Linux using the command line. This includes system administrators, software developers and enthusiastic users who are interested in getting things going from the command line.
The audience of the book is not limited to any specific distro, as the book is distribution-agnostic -- everything in it has been checked against Ubuntu, Red Hat, and SUSE. This means the book should be useful with most of the mainstream distributions. Only users of some of the more exotic distributions like Gentoo, Arch, and to some extent Slackware might be worried about the selection of those three. On the other hand, users of Gentoo and Arch probably do not need a book written for command line newbies.
In fact, the book could well be read by anyone who is not familiar with the concepts of open source and the different Linux distributions, as the first chapter introduces these concepts before even logging into an installed Linux system. After logging into a running system, the author starts by explaining the structure of commands and their options, piping and redirection, and how to get more help with the man command and the --help option.
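For readers who have never met these ideas, a few lines (using ls purely as an arbitrary example command) show what options, pipes, redirection and the built-in help look like in practice:

```shell
# Options follow the command name; single-letter options can be combined:
ls -l /etc                                  # "long" listing of /etc

# A pipe (|) feeds one command's output into another command:
ls /etc | wc -l                             # count the entries in /etc

# Redirection (> and 2>) sends output and errors to files instead of the screen:
ls /etc > /tmp/listing.txt 2> /tmp/errors.txt

# Getting more help on any command:
# man ls       - the full manual page (press q to quit)
# ls --help    - a quick one-screen summary of the options
```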
Only after explaining the very basics does the author move on to system administration, beginning with changing the password, working with virtual consoles and becoming another user. After this, the command line newbie learns how to obtain information about other users, how to communicate with them and how to move around the file system. Unlike most command line books, this one really does not expect any previous familiarity with the command line!
In the next chapters, the command line newbie learns about administering the file system: mounting disks, checking file system integrity, creating backups and working with links. Even more experienced Linux users might learn something new -- for example, I have never used LVM, and the fifth chapter gave me an overview of logical volumes.
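To give a taste of one of those topics, links can be tried out safely in a couple of lines (the file names here are just my own examples):

```shell
# A symbolic link is a tiny file that points at another path:
echo "hello" > /tmp/original.txt
ln -sf /tmp/original.txt /tmp/soft-link
cat /tmp/soft-link                        # reads through the link and prints: hello

# A hard link is a second name for the same data on disk;
# both names show the same inode number in ls -i:
ln -f /tmp/original.txt /tmp/hard-link
ls -i /tmp/original.txt /tmp/hard-link
```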
The sixth chapter is dedicated to managing users and groups. More experienced Linux users might find the discussion of user quotas and authentication methods the most interesting parts of the chapter.
In the seventh chapter, the newbie learns to manage permissions. The eighth chapter teaches the reader to manage software, using either .deb or .rpm packages and the respective package management tools. The next chapter is about process and system management, and the tenth chapter about system logs -- both are interesting reading for anyone willing to learn basic system administration.
The next chapters are a quick introduction to using Linux as a server. The eleventh chapter is about configuring the network, and the twelfth chapter teaches us how to configure a file server. And if this is not enough, in the final two chapters you will learn about tuning the kernel and about basic shell scripting.
I can warmly recommend this book to anyone who has used Linux as a desktop but never bothered to look at the command line and the rich possibilities it provides.
Even a basic knowledge of the command line is essential for those who want to use an old computer with a custom lightweight Linux installation. After reading this book, you are one step closer to being a power user who can install and use the more exotic distributions on old computers. If you belong to the target group of the book, you will certainly learn a lot from it. If you are a more experienced Linux user, you might still find the chapters about servers and the kernel useful.
Order the book directly from Amazon.
Rip CDs on the command line: ripit
I try to use simple command line tools for many tasks that most readers probably accomplish with the default tools provided by GNOME or KDE. Thus I listen to Ogg Vorbis music files with ogg123, and I rip my CDs with ripit.
Ripit is a command line tool written in Perl. It is a front-end to several other command line tools and gives them an easy-to-use interface that is as simple as it can be. It can fetch the CD title and track information from CDDB, rip the CD, tag the songs, create a playlist and, if necessary, edit the CDDB information. Usually I just accept the defaults the application suggests.
Simple and effective.
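For the curious, a typical session is nothing more than typing ripit in a terminal and answering a couple of questions. The sketch below wraps an unattended rip in a small shell function; the --coder and --nointeraction flags are quoted from ripit's documentation as I remember it, so verify them with ripit --help before relying on them:

```shell
#!/bin/sh
# rip_to_ogg: rip the CD in the drive to Ogg Vorbis without prompting.
# Flag names are from memory; check `ripit --help` on your own system.
rip_to_ogg() {
    if ! command -v ripit >/dev/null 2>&1; then
        echo "ripit not found; install it first" >&2
        return 1
    fi
    # --coder 1 should select oggenc; --nointeraction accepts the CDDB defaults.
    ripit --coder 1 --nointeraction
}

# Afterwards the result can be played from the console, for example:
# ogg123 "$HOME"/ripit-output/*.ogg
```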
Random Thoughts: Backing Up and More
It is August again, and it seems I have less time for blogging every week. At the moment I have to devote most of my time to studying economics and business administration. Hence I rarely have time to test new applications, not to mention new distributions. I need operating systems, applications and distributions that just work.
Unfortunately, this means that I just cannot spend an evening or a weekend configuring some of the more interesting distributions I would really like to try. Crux and Arch are two distros I would most probably enjoy using, but at the moment I am going to continue with the distributions I have used since the beginning of my geek years: openSUSE and Debian. SuSE 7.1 or 7.2 was the first Linux I installed as my main desktop, in 2001, at the university where I worked as a researcher. Since then, I have tried and used most of the mainstream distros for at least a year or two on the desktop, either at work or at home.
At home, I use openSUSE 11.1 as my main desktop. It is supported by a Debian box that at the moment serves as a quick & dirty backup for my files. I have not had time to build anything fancy: I did a command-line-only installation on the box, which can be accessed only over ssh on the LAN.
I back up my desktop simply by turning the Debian box on and then executing a very simple script that basically rsyncs the /home partition on openSUSE to the remote /home/backups/ on the Debian box. So I have not yet built an automated backup system with incremental or differential backups; I just mirror the ordinary users' files on another computer.
The backups can "easily" be restored over sftp or by mounting the Samba share provided by the Debian box. I decided to use Samba instead of a purely Linux solution because I wanted to be able to copy the backups even to an XP laptop and my eMac. I would love to have a fully functioning system that could sync files between several computers in different locations, but for now that project has to rest.