Sharing Files with Samba the Easy Way


Yeah, I hear you talking, but I don't believe a word. You say you're gonna keep it simple. You say you're gonna stick to just having one Linux machine to play around with, and you're not gonna need to think too much about networking. Well, let me tell you, it just doesn't work that way at all.

You see, there is one dirty little secret that long-time Linux users know but keep to themselves. Something no one tells newbies. But I will. So here it is: Linux is highly addictive, and having one Linux machine is like eating just one potato chip. You can't do it, I tell you! Second-hand computers are cheap and Linux is a free download, so there's nothing stopping you from finding yourself with a house full of happy Linux boxes, all humming away. There's always another reason to add one more machine… Just wait, you'll see.

Another dirty little secret they won't tell you is that Samba, the almost universally accepted software used to interface Linux with Windows-style peer-to-peer networks, is a complex program. It's hard to configure and, at times, a source of frustration for even the most advanced Linux user. Yes, there are GUI-based tools that promise a point-and-click setup of Samba shared folders, but they always disappoint in the end. To make things worse, the average home user will find a dizzying array of setup options and configuration schemes when they look online for guidance. I struggled with Samba for a long time until I ran into a very simple solution that makes working with it a breeze. This article and video will show you how to set up a client/server-style network that will work whether you have 2 machines or 200.

Getting the server software is as easy as installing the samba package from your distro's repositories. Once it's installed, the hard part is getting it configured to do your bidding. If we want a trouble-free networking experience, it'll take a bit of thought, and we need to set some goals.
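On a Debian- or Ubuntu-based distro, for example, that usually boils down to one command (the package name can vary slightly on other distros):

sudo apt install samba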

Goal One: Keep it simple

I once took a job at a large radio and TV facility with several studios and offices. Needless to say, there were a lot of computers; some were for general office work and others were highly specialized production and automation machines. The production network was well designed, following the guidelines set by the company that provided the automation software. No worries for me there. However, the office network was a mess. Basically, every desk had a PC and every PC was set up to share files and printers in a Workgroup, making it so everyone could share everything. There were lots of machines on the network with lots of folders shared on each one. There was no way to know where anything was, so you'd have to call the person you wanted to share something with and have them click where you told them to find it. It was a mess and something had to be done. The solution was super simple and I have set up every network the same way since.

I grabbed one of the extra PCs collecting dust in the storage room and stripped it down to be a lean, mean serving machine. All this thing did was sit in the corner and share files, and as an added bonus it needed very little attention. This machine shared one folder only. Each person in the building had a folder within that folder and everyone had access to everything. Instead of each workstation sharing a folder, they just had a shortcut on their desktop to the server. Anyone who wanted to share files could simply drag them into their own corresponding folder, or drop the file in someone else's folder on the server. By doing so, anyone on the network could access them. Files on the server could be added, deleted or modified in place. Everyone knew that what they put on the server was public and that they needed to keep local copies of stuff they didn't want to lose, just in case something got blown off the server or it crashed. With a memo and a meeting to explain how it worked, the office network was suddenly much more efficient and people finally knew where to look for the files they wanted to share. I was a star and everybody loved me.

Most home users won't want to set aside a separate machine just to serve files. Luckily, it's no big deal, because a simple Samba server can run in the background on whatever machine you choose, preferably one that is online the most. You can also share things like music and video folders in such a way that anyone can read or copy the files, but they can't delete them or change them. You don't even have to be logged into the account you set up the server or shared files from; those files will be available on the network as long as that machine is up and running. Someone else can even be using their own account on it too. The Samba server doesn't care.

One of the really nice things about networking this way is that only one machine has to be configured as a server. Pretty much every Linux distro comes with a Samba client set up already, and all you have to do is browse the network from a file manager to connect to any machine serving files. You could even set up multiple servers if you so desired.
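If you'd rather poke at a server from a terminal, and assuming the smbclient package is installed, you can list what a machine is sharing without logging in. Here, mylinuxbox is just a stand-in for your server's hostname or IP address, and -N skips the password prompt to match the guest-friendly setup we're building:

smbclient -L mylinuxbox -N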

Goal Two: Keep it real

Do you have a bunch of wireless devices that need to be hooked up to the Internet? Do you have more than one computer that needs Internet access? Most likely, the answer is yes and you probably do all that with a router. That router probably acts as a hardware firewall and the only way for anyone to get to your devices would be for them to get into your local network. Which is fine, so long as you have a strong Wi-Fi password and don’t give it out to everyone that passes by. If your password is secure, then you can feel relatively safe when it comes to local network shares. That said, there’s really no need to make users log into a shared folder just to play a song or copy a picture that you yourself have designated as shareable. Many of the pitfalls we encounter with Samba have to do with security features, so why not just turn them off?

Another major issue with Samba, even when sharing with Windows computers, is file permissions. By default, Samba tries to preserve file permissions. With the standard Samba setup, files you pull off the network will have to be copied to a local folder to make them belong to you, or manually edited to change permissions, in most cases using root privileges. This is a pain and could be really confusing for less savvy users on your network. What most home users want is to just be able to copy a file onto a local machine and have it be theirs… so that's what we'll do.

A Word About Windows

Samba was designed to share files between Windows and Linux, and it works quite well most of the time. Setting up a Windows machine to work with Samba is not something I'm prepared to get into, but it is worth noting that Windows 7 and up presents some challenges. Microsoft came up with a new sharing scheme called HomeGroup that doesn't play well with Samba. You'll have to configure your Windows machine to use standard Workgroup-style networking just to interact with Samba. There's lots of info available on how to do this and it's not too terribly hard to get going.

A curious quirk of sharing files with Windows is that sometimes a Linux file name won't work with Windows. The naming conventions are just slightly different. Windows doesn't care about the case of the text when dealing with files and Linux does. Also, there are some characters that Linux will gladly let you use in a file name that Windows won't. Samba tries to reconcile these issues with mixed results. You may find yourself having to rename a file every once in a while to get it through the Samba network, even when going from one Linux machine to another. One way to get around these bugaboos is to take any files that absolutely must retain the exact same properties and throw them into a tar.gz file before putting them on the network. This is a good thing to keep in mind when using Samba to share configuration files that you want to import into another machine.
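For example, something along these lines bundles a folder up with its names, permissions and ownership intact; the paths are placeholders for whatever you actually want to move:

tar -czvf my-configs.tar.gz ~/.config/some-app
tar -xzvf my-configs.tar.gz

The first command builds the archive and the second unpacks it on the destination machine.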

https://www.youtube.com/watch?v=n-uSzkGhHDY

The Configuration Process

The goals we talked about above will be achieved by creating custom configuration files from scratch. I go through how to install the Samba server and create your own custom Samba configuration files step by step in the video, but here are some sample files to show you what needs to be in them. Use this as a quick reference when you're configuring your own network:

Contents of /etc/samba/smb.conf

[global]
        # A friendly description that shows up when machines browse the network
        server string = Dell Desktop
        workgroup = WORKGROUP
        # Let unknown users in as guests instead of demanding a Samba account
        security = user
        map to guest = Bad User
        name resolve order = bcast hosts wins
        # Pull the actual share definitions in from a separate file
        include = /etc/samba/smbshared.conf

Contents of /etc/samba/smbshared.conf

[Joe's Music]
       # Read-only share: anyone can browse and copy, nobody can change anything
       # force user makes all access happen as joe, which keeps ownership sane
       force user = joe
       path = /home/joe/Music
       writable = no
       public = yes

[Network File Server]
       # Wide-open drop box that anyone on the network can read and write
       force user = joe
       path = /home/joe/Public
       writable = yes
       public = yes

 

You can change the share names and the name of the Workgroup, of course. It’s all in the video.
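Once the files are in place, it's worth checking your work and restarting the server. On most systemd-based distros the Samba services are named smbd and nmbd, though the names can vary:

testparm
sudo systemctl restart smbd nmbd

testparm reads your smb.conf and complains about anything it doesn't understand, so typos show up here instead of turning into phantom network problems.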

One Last Thing…

You may get a message asking whether you want Samba to automatically configure itself when you install an updated version of the Samba server. Don't! Just say no, to make sure your custom config files aren't overwritten. It's a good idea to keep a backup copy of your configs as well. I keep a document file with the config text I always use, and cut and paste from it when I need to re-install Samba after a change of distro or an upgrade to the OS on the machine running the server. It takes two minutes at the most and then I'm back in business.

Good luck and don’t have too much fun playing with Linux file sharing!

WiFi Without Network Manager Frippery


Back in my day, sonny… there was a time when you could make your networking work without the network manager applet. Not that I'm saying the NetworkManager program is bad, because it actually has been getting better. But the fact of the matter is that I'm a networking guy and a server guy, so I need to keep my config-file wits sharp. So take out your pocket knife and let's start to whittle.

Begin by learning about your interfaces and making some notes before you turn off NetworkManager. You'll need to write down these three things:

1) Your SSID and passphrase.
2) The names of your Ethernet and radio devices. They might look like wlan0, wifi0, eth0 or enp2p1.
3) Your gateway IP address.

Next, we’ll start to monkey around in the command line… I’ll do this with Ubuntu in mind.

So, let’s list our interfaces:

$ ip a show

Note the default Ethernet and WiFi interface names in the output.

It looks like our Ethernet port is eth0. Our WiFi radio is wlan0. Want to make this briefer?

$ ip a show | awk '/^[0-9]+: /{print $2}'

The output of this command will look something like this:

lo:
eth0:
wlan0:

Your gateway IP address is found with:

route -n

The gateway provides access to destination 0.0.0.0 (everything). On my network it is 192.168.0.1, which is perfectly typical.
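If the old route tool isn't installed on your system, the iproute2 equivalent gives the same answer; the address after "via" is your gateway:

ip route show default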

Let's do a bit of easy configuration in our /etc/network/interfaces file. The format of this file is not difficult to put together from the man page, but really, you should search for examples first.
Plug in your Ethernet port.

Basically, we're just adding DHCP entries for our interfaces. (In my routing table you'd also see a route to another network that appears when I get a DHCP lease on my Ethernet port.) Next, add this:

 

# Loopback interface
auto lo
iface lo inet loopback

# Wired Ethernet, configured over DHCP
auto eth0
iface eth0 inet dhcp

# WiFi radio, also over DHCP
auto wlan0
iface wlan0 inet dhcp

 

To be honest, that’s probably all you will ever need. Next, enable and start the networking service:

sudo update-rc.d networking enable

 

sudo /etc/init.d/networking start
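If your release has already moved to systemd, the same thing is usually handled through systemctl, assuming the ifupdown package that provides the networking service is installed:

sudo systemctl enable networking
sudo systemctl restart networking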

Let's make sure this works by resetting the port with these commands:

sudo ifdown eth0

 

sudo ip addr flush dev eth0

 

sudo ifup eth0

This downs the interface, flushes the address assignment on it, and then brings it back up. Test it out by pinging your gateway IP: ping 192.168.0.1. If you don't get a response, your interface is not connected or you made a typo.

Let’s “do some WiFi” next! We want to make an /etc/wpa_supplicant.conf file. Consider mine:

# Replace ssid and psk with your own network's values
network={
ssid="CenturyLink7851"
scan_ssid=1
key_mgmt=WPA-PSK
psk="4f-------------ac"
}
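You don't have to write that block by hand. The wpa_passphrase tool that ships with wpa_supplicant will generate one, with a hashed psk, from your SSID and passphrase. The passphrase below is obviously a placeholder, and you'll need to add scan_ssid=1 yourself if your network requires it:

wpa_passphrase "CenturyLink7851" "your-passphrase-here" | sudo tee /etc/wpa_supplicant.conf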

Now we can reset the WiFi interface and put this to work:

sudo ifdown wlan0

 

sudo ip addr flush dev wlan0

 

sudo ifup wlan0

 

sudo wpa_supplicant -Dnl80211 -c /etc/wpa_supplicant.conf -iwlan0 -B

 

sudo dhclient wlan0

That should do it. Use a ping to find out, and do it explicitly from wlan0's address, so grab that address first:

 

$ ip a show wlan0 | grep "inet"

192.168.0.45

$ ping -I 192.168.0.45 192.168.0.1

Presumably dhclient updated your /etc/resolv.conf, so you can also do a:

ping -I 192.168.0.45 www.yahoo.com
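If you'd like the WiFi side to come back on its own after a reboot instead of running wpa_supplicant by hand, the wpasupplicant package on Debian and Ubuntu hooks into ifupdown. A stanza roughly like this in /etc/network/interfaces, replacing the plain wlan0 entry we added earlier, should do it (SSID and passphrase are placeholders):

auto wlan0
iface wlan0 inet dhcp
    wpa-ssid "CenturyLink7851"
    wpa-psk "your-passphrase-here"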

Well guess what – you’re now running without NetworkManager!

Why Open Source Software?


Hi, my name is Albert Westra. Back when I was around 12 years old, my best friend introduced me to the operating system we all know and love (and grieve for): Mandrake. Back then, I wasn't really all that interested in open source software. Heck, all I really cared about with Mandrake was the fact that I could change the wallpaper on every single virtual desktop, plus mess around with how my desktop looked and felt. Additionally, I was completely overwhelmed by the number of disks that were required to install it.

A few months later I ended up erasing my parents' IBM tower by accident, and that mistake began my curiosity with computers. It wasn't until I was 15 that I built my first computer and became interested in computer graphics. It was during this period of time that I was introduced to the wonderful world of Adobe software, specifically Photoshop. I was very intrigued by what I could do with it, but I wasn't really willing to pony up the money for it after the trial period was up. Then my best friend showed me Ubuntu 6.10, which had a piece of software pre-installed known as (the) GIMP.

Over the next three to four years I bounced between GIMP/Ubuntu and Photoshop/Windows, mainly because each title had something the other couldn't do. Fast forward to today, and you wouldn't see me using Windows or Photoshop even if my life depended on it. Especially Windows 10.

So why go open source? Some choose it because they value their privacy above all else, especially after the public revelations of a certain NSA whistleblower, and a recent website hack conundrum that has left many a man disappointed and couch-ridden. For others, it is and ever will be a matter of principle. Software to them is a form of free speech which should remain in the open and stay in the open. For me, and probably a lot of others switching to open source, the big initial reason was ultimately the cost.

Granted, just because a project is open source and free to download doesn't mean it was free to create. It still requires developers to write the code, and when it comes to graphically focused projects, a lot of math. A "shit ton" of math. Enough math to make someone have a hangover from all the math they did the night before. Like the hangover you'd get if you had a few too many and woke up the next morning newly married to a member of a biker gang. And yet… these writers of code make it freely available on the inter-webs for people like you and I to use to our heart's content. While my initial reason for using open source software was the big gaping hole in my wallet, that doesn't mean I don't owe them anything. In fact, I owe these developers a great deal. If it wasn't for them, I probably would have given up on drawing and pursued something else. I do give back when I can, but it will never amount to what I truly owe them.

Now, my next reason for using open source software does kinda flow with the first one, but it's a little more of an incentive. I may not be able to give as much as I wish I could to projects, however I'm able to focus where that money goes within a particular project. Take the Krita project, for example. So far they have launched two Kickstarter campaigns and both have been successfully funded. What made their Kickstarters work was not only the swag (though we do like swag), but that those who contributed money got to choose the direction the development would go. Granted, that list was predetermined before the Kickstarter campaign began, but it was made up of features that had already been requested by the Krita community. Ten years ago, it was pretty hard to fund the projects you wanted to help, but today you are only 2 to 5 clicks from sending money anywhere in the world. It can also be said that you could use the same money you would have spent on proprietary software to learn a bit of code and fix a bug or add a new feature to the software of your choice. So while most open source software does not cost anything upfront, it does give you the opportunity to give back financially or even with your personal time.

The biggest reason why I "go open source" and use it daily is ultimately the community. Call me a sucker, but there are some really amazing people in the open source community, and it's not just the developers. It's the writers, artists, engineers, scientists, and others who use the software on a daily basis. If it wasn't for open source software, we probably wouldn't be as technologically advanced and knowledgeable as we are now. In fact, the majority of the Internet we use as a communication tool runs on one of the biggest code collaborations in the world. This collaboration is known as the Linux kernel. Obviously, if it wasn't for this particular community, we probably wouldn't be having this conversation right now. Another example of this is the MyPaint community, which I'm a part of. I not only enjoy reading about what is going on in the code, but I also enjoy talking to some of the people in the community about it. Plus, every once in a while I get to submit a bug fix. Quite simply, I like the interaction.

Is the open source community perfect? I would like it to be, but it is far from it. Just imagine: the person who oversees the Linux kernel is not a benevolent angel. He can be, or is, for lack of a better word, a dick. Especially if you yourself have experienced his wrath or been associated with a company that he has flipped off. Now, I'm far from perfect and have made a few mistakes which have affected those around me. But despite our individual challenges and the behavior of a few, this chaos of people from around the world manages to get along and create amazing projects. So, all in all, it is a beautiful mess. My reasons for going open source are probably different than your reasons, and that's okay. We are people who can not only think as a group, but are individuals as well, and we will have different opinions on any subject. One thing is for sure: when we get together and collaborate, we can achieve great things, even if it's just at the software level.

We have great open source software that can rival its proprietary counterparts; however, don't forget about the people behind the scenes. One of my main goals here on Freedom Penguin, besides software reviews and technical tutorials, is to interview the people behind the code. And not just the developers, but also their communities. If any of you readers out there have someone that you want interviewed, be sure to contact one of us here at Freedom Penguin. Most of my articles will be focused on the artist community, because it's what I love to do. However, don't be afraid to ask me any questions. If I don't know the answer, we'll figure it out together. I'm honored to be part of the community Matt has assembled, and I hope you will not only read, but enjoy our content.

Web Server File Permissions Mystery Solved

Ever wonder if this whole Linux thing is actually unholy devil worship? If you've ever worked on web server file permissions, you might think so. They take the right amount of searching, and just short of too much coffee, to solve when a problem presents itself. (Already, I've probably had too much coffee, as I just caught myself bobbing up and down.) Continue reading to find out about a strange problem that might happen to you as well. I'd been watching a problem occur on a web server for about a month, where a file from a customer registration becomes unusable because the file permissions are wrong. Not just a little wrong either, bonkers wrong:

 

unable to open file 
/var/tmpcgi/registration.txt at /usr/local/bin/updater.pl line 33.
total 12
drwxr-xr-x. 34 wheel       wheel         4096 Aug 17 10:24 ..
--w--wx-wT.  1 wwwuser     wwwuser         30 Aug 27 05:51 registration.txt
drwxrwsr-x.  2 wheel       wheel         4096 Aug 27 05:51 .

What makes an error like that? All my scripts were setting wide-open permissions on files for this process. (I know: tsk, tsk.) Despite this, the problem bugged me for weeks. I created a script just to find the oddball permissions. Actually, that wasn't a great solution, because even my find command was wrong. What could be wrong with:

find -type f -perm -220

Well, for starters, it didn't do what I wanted: -perm -220 matches any file that has at least owner and group write set, which is most of the tree, not just my oddballs. So, this morning I finally searched for what creates files with "--w--wx-wT" and stumbled across something helpful. I found forum posts chastising a user for filing bug reports about his own ineptitude: he was using the string chmod "666" and not the octal chmod 0666.

Unlike that user, I *know* that I don’t want to use string 666, but it did give me something to search for:

$ grep -r 666

 

Now, how often do you search for the devil in the details? I wasn't using a string, but it was still wrong. In Perl, saying chmod 666, $filename; is just as bad. The 666 is decimal. Devil horns, that won't work! Use chmod 0666, $filename; and you escape hell. Not to heaven, but to octal.
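If you want to see exactly what that bug produces, a quick one-liner shows what decimal 666 looks like once it's treated as a raw mode value:

$ perl -e 'printf "%04o\n", 666'
1232

Mode 1232 is the sticky bit plus --w--wx-w-, which ls renders as --w--wx-wT: exactly the bonkers permissions in the listing above.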

Later, I found something in my search results that you geeks will probably find useful: there is a table of unholy permissions low down in the man page for the Stat::lsMode Perl module. I recommend putting this in your hat for future reference!

 

Recommended Linux Distro

On August 10th, 2015 Jory asked…

I’ve taken on Linux a few times, tried everything from Ubuntu to Mint, even tried setting up and running Gentoo on my own. In the end however, I always default back to Windows. I don’t know if it’s because I’m an avid gamer and installing games on Linux can be a pain or if it’s because researching problems I run into becomes overwhelming or what. But after reading a bunch of your articles on Datamation, especially the W10 Vs Linux one, I’m tempted to try it again….

My question would be, what do you recommend for distros? I’m a rather tech savvy person myself, I was a computer technician and an internet technician for 4 years, although I understand that hardware/software and fixing windows is a lot different than Linux and I think that’s where my problem lies. I’m all ears and looking for more advice from someone who seems to have an amazing understanding of the systems.

A new fan,

-Jory


Hi Jory,

I totally understand your frustration with trying to make the switch to Linux while maintaining your sanity. Too often, Linux issues receive a heavy-handed response, while the same sort of issue with Windows is usually "allowed" by the masses. I know; I used to do this myself way back when.

With any luck, the following recommendations will help you along your way. First off, allow me to recommend my favorite Linux distribution – Ubuntu MATE.

Why Ubuntu MATE?

My main PC’s Linux distro is Ubuntu MATE 14.04. I found it to be stable to use and highly customizable. I also like the direction the distro is heading in with regard to adding new features. For example, the Ubuntu MATE 15.10 release will provide a new tool called Ubuntu MATE Welcome. This tool will provide a solid starting point where the new user can get needed applications, find support, and even have a place where they can get involved with the project. This is the distribution I recommend hands down.

Now, the next consideration when trying out Linux is determining the compatibility of the hardware you're running. While most things work just fine out of the box, every once in a while you may have issues with video, audio or WiFi. Below, I'll share some tips on how to tackle these challenges by giving support forums the information they need to be of assistance.

Getting help with potential hardware issues

Before I jump into this, remember the following: When asking for help, never post without being explicit about exactly what you did to get to the error and what hardware you have. Usually, the worst example is “my _____ doesn’t work.” Folks in the forums need to know what sort of hardware you’re dealing with. So if the issue is video related, then video card (model, brand) details are critical. Same applies for audio and wifi issues.

How do you know what hardware you’re running? From a command prompt (terminal), use the following tools to help determine what your hardware is. The following assumes you’re running an Ubuntu based distribution such as Ubuntu MATE.

Network card (even wifi):

sudo lshw -C network

Video card:

sudo lshw -C video

Motherboard:

sudo dmidecode -t 2

Various USB devices:

sudo lsusb

Note that with USB, sometimes the resulting names listed may seem different than the brand you have in front of you. For example, sometimes running lsusb only provides details that don’t make any sense. In the past, I’ve seen Ingram and Jing-Mold Enterprise Co., Ltd listed. Neither of these lsusb results mean anything to me.

However if I run this command instead:

sudo lsusb -v 2>/dev/null | grep -E '\<(Bus|iProduct|bDeviceClass|bDeviceProtocol)'

I'm presented with "iProduct" names: "Ingram" represents my USB gaming mouse and "Jing-Mold Enterprise Co., Ltd" represents my USB keyboard. This information is helpful as it identifies which items are hubs, keyboards and mice.
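Another quick way to sort out which entries are hubs and how everything is chained together is the tree view lsusb already has built in:

sudo lsusb -t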

If using the command line isn’t for you, then you can install a program called CPU-G if you have a working Linux desktop environment on the PC in question. For Windows refugees, this will feel very familiar. It provides you with your CPU, Motherboard, RAM, and system details.

To install CPU-G, you’ll need to add the software repository so it can be installed and Ubuntu updates will keep it current with new releases.

In a terminal, paste in:

sudo add-apt-repository ppa:cpug-devs/ppa

(hit enter key, then paste)

sudo apt-get update && sudo apt-get install cpu-g -y

Once completed, CPU-G appears under Applications>System Tools.

Discovering and installing software

The next and perhaps biggest consideration for a new Linux user is software discovery. For Ubuntu MATE users, I recommend installing AppGrid. Some Linux users aren't fans of it since its source code isn't open, however it's by far easier to use than the alternatives I've tried in the past. It provides you with a visual source of software discovery, regardless of its software license.

Obviously, there are alternatives if the closed source nature rubs you the wrong way. You’re also welcome to install the Ubuntu Software Center. On Ubuntu MATE, if it’s not already installed, you can get it installed by pasting this into the terminal:

sudo apt-get install software-center

Starting with Ubuntu MATE 15.10, you’ll also have access to a tool called Ubuntu MATE Welcome (mentioned previously) which will help provide a solid launching point for applications most people might consider critical.

Bringing it all home

So Jory, that is a ton of information I dropped into your lap. And while it might seem overwhelming at first, it really breaks down into the following:

  • The importance of providing the correct hardware information.
  • Tools to discover and install software.
  • Which Linux distribution I recommend.

If you follow the advice above, I have no doubt that you'll have a good time diving into Linux on the desktop and thoroughly enjoy the experience.

Before taking the next step and installing Linux onto your hard drive, remember this: run a live installation from a USB key first, since it won't touch your hard drive. Test out playing audio, video and wireless networking. If you're happy with the results, then you can look into installing it alongside Windows so you don't have to give up any games.

Do you have Linux questions you’d like Matt to help with? Hit the link here and perhaps you too, can Just Ask Matt!

Building My Own Ubuntu


You probably don’t need to build your own distribution of Linux. There are already so many to choose from and with a bit of research, I bet you can find one all prepackaged and ready to download that will do what you want it to. Right? Then again, why shouldn’t you? Even well seasoned Linux users tend to forget just how modular Desktop Linux really is. People also tend to think that the only folks who can put together their own distros are natural-born hackers who live in a terminal all day. Not true at all! Anyone who has a good basic understanding of what a Linux Desktop is made of and can execute a few simple commands in the right order can build their own Ubuntu based Linux Desktop. It’s easier than you think and it can be a great learning experience too.

Even if your build doesn’t turn out to be what you needed, you’ll have a better understanding of Linux. Plus a deeper respect for the folks who develop and maintain the popular ready-to-install distros we have to choose from today. Yeah, you really ought to build your own Linux distro after all.

Let’s start with a few principles first, shall we?

What we call a Linux Desktop Operating System is actually just a set of programs that work together to give us the illusion of a cohesive computing experience. The name Linux refers to the kernel only. The same Linux kernel can run a TV, router, smartphone or even your refrigerator. To understand this a bit better, let’s take a moment or two and go through what actually happens when you boot up your computer. I’m going to way oversimplify this; there are actually thousands of steps in the process, but we’ll only talk about the big ones.

Step one is to load the kernel, and the bootloader is what gets that accomplished when you first turn on your computer. GRUB (the Grand Unified Bootloader) is pretty standard for Linux these days. It lets you choose between different OSes that might be installed on your machine and it also has some options to boot into special kernel modes for troubleshooting issues or running diagnostics. Most Ubuntu users don't even see GRUB unless they are running a dual boot. The bootloader starts the process that initializes the system and loads the kernel. Now we can talk to the computer, but with just a kernel we can't do much.

Next we need a shell. You're familiar with the Bash shell that runs in a terminal emulator on your desktop? Same thing here. It has a few built-in commands but is really only useful if it's bundled with utilities and programs that are designed to work with it to get the actual work done. The shell plus these utilities form what is known as a base system. Most server implementations of Linux are nothing more than a base system with a few server-type applications installed. Servers are just robots that run in a rack somewhere sharing files or managing databases; they have very little interaction with their masters and don't need any kind of GUI (Graphical User Interface). Boring as hell, if you ask me.

 

To make our Linux system a bit more exciting and useful, we'll need to add some more stuff to the system. We'll need a Display Server to actually put something other than text on the screen and let us use touchpads and mice to interact with our computer. Xorg has been the standard display server for decades and most Linux heads just call it 'X.' There are newer display servers on the horizon that promise more speed and functionality, like Mir and Wayland. What does the display server do? Basically, it draws the pictures that other applications tell it to on your screen. It is what talks to your graphics card and tells it what to do. Fortunately, X is pretty much set up automatically these days and the average user should never have to deal with any manual configuration. X won't do anything by itself, though. We need more stuff to get anything useful out of it.

The next thing a modern desktop needs in the boot order is a Display Manager. If you log into a system with no X installed, you'll be presented with a very simple login prompt, and once you provide your user name and password it will dump you at a prompt and wait for you to type in a command. You could install Xorg and then install a Desktop Environment (DE) and skip the display manager entirely, but that would mean that you'd have to start X and load the desktop manually every time you logged in. It would probably get really old after a while because most of us just wanna jump right into a GUI and start pointing and clicking away. The Display Manager takes care of the login stuff, and it adds extra features like the ability to use more than one desktop environment and allows you to do fancy things like switch users.

Most Linux DE’s use their own DM’s: Ubuntu’s Unity uses LightDM, Gnome uses GDM, Linux Mint uses MDM, KDE uses KDM and so on. There are actually many to choose from and the main reason I’m taking the time to talk about them is because some desktops don’t have a default and you would have to choose one if you were doing your own install from scratch. For instance, XFCE doesn’t pull in a DM automatically when you install it and you may find yourself scratching your head, wondering what went wrong. Typing in the command ‘startx’ will get you XFCE, but you don’t wanna do that every time. Or, maybe you do! What if you’re working with a server and you don’t plan on running it in graphics mode all the time? Well, easy-peasy, just don’t install a display manager with XFCE. Done.
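If you later decide you do want a graphical login, pulling in a display manager is only a package away. LightDM is a common lightweight choice on Ubuntu bases, and if more than one display manager ends up installed, the package system will ask which should be the default:

sudo apt install lightdm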

While we're on the subject of Linux and DMs, it's interesting to note how this integrates with the shell. When you boot Linux without any GUI, you are dumped into what's called a TTY. This is a virtual terminal. You'll notice that it usually says you are in 'tty1'. There are several TTYs available at startup and you can move from TTY to TTY by holding the Ctrl and Alt keys and pressing the corresponding function key.

TTY1 is F1, TTY2 is F2, TTY3 is F3 and so on. How many TTY’s there are depends on the distribution and you can be logged in on several TTY’s at once.

The DM usually is assigned to TTY7 or TTY8 and the shell automatically switches to that one at startup once the DM is installed and auto-configured. If you’re using a Linux computer to read this article, you can try it right now. Don’t worry, you won’t lose your place. Try pressing Alt+Ctrl+F1 right now and you should go to a text login. To get back, press Alt+Ctrl+F7 or F8. See how it works? How cool is that?

The full, super-simplified boot process looks like this: GRUB + Kernel + Shell + Display Manager + Desktop Environment.

This brings us to the Desktop Environment, and we have many to choose from here. You could opt for a super simple Window Manager like OpenBox, or install KDE's latest Plasma Desktop Environment. Here's where you need to do a bit of research, because the waters get a bit murky once you've chosen your DE. First off, you gotta decide where your desktop environment's packages will come from. Ubuntu has most of them already in the standard repositories and installing them can be as simple as issuing a command to install one package that will pull in most of what you need, including Xorg and a Display Manager. Nice.

The only problem is that the DEs in the repos are sometimes older versions and you may want the latest and greatest in your custom install. This most likely means that you'll have to add a PPA or two to your base system to get those packages. Whatever you decide, you need to do a bit of reading to determine the best path for you. A great place to start is the desktop project's own website. Most have pretty comprehensive instructions on how to get started from a base system. There are a lot of great blog posts and videos out there too. Just keep looking until you feel confident you can do it.

Getting a base install of Ubuntu is easy. You can do one of two things: use the Network Installer to build a base system by not installing any desktops when it asks you what flavor of Ubuntu you want, or get the Server version, install it, and then use a tool called 'tasksel' to remove all the server components, leaving you with a base Ubuntu system. I think using the Network Installer is probably the best way to go, but it does take some time, and a good, solid Internet connection is a must.

To get it, first go to Ubuntu.com and then go to the Downloads tab. Click on 'Alternative Downloads' in the menu. You'll find all kinds of links to different versions of Ubuntu, but the network installers will be listed right up at the top. Choose your version and then choose the download that you want. You can pick different kernel versions here too. Once you've made your choice, you'll be taken to an FTP menu that offers a bunch of files to download. The one you want is called 'mini.iso.' This is your installation media and all you have to do is boot your machine from it and then follow the prompts.

When it asks you what software to install, the only thing you might consider is a print server. If you choose a listed version of Ubuntu from here, it will install a prepackaged spin, not just a simple desktop. It is important to keep in mind that you’ll need a wired connection for all of this to work.

Once you've selected a desktop and it is up and running, you'll probably need to edit a file to get the Network Manager applet to control the network on your machine. This is also necessary if you need to activate wireless access. There's a good source of info on how to do this at Help.Ubuntu.com.
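As a rough sketch of what that edit usually involves on an Ubuntu base (treat it as a starting point, since your setup may differ): in /etc/NetworkManager/NetworkManager.conf, tell NetworkManager to manage the interfaces the installer handed to ifupdown by changing the [ifupdown] section to read:

[ifupdown]
managed=true

Then restart the service with sudo service network-manager restart, and if an interface still won't show up in the applet, remove its entry from /etc/network/interfaces so the two systems don't fight over it.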

Now that your basic desktop is installed, the fun really begins because it won’t include all of the things you’re used to getting. Every app, every utility and even a web browser will have to be installed manually. You’ll most likely have to get your own icon themes, backgrounds and window decorations as well. You can make it as tricked out as you like or just install the basic stuff you need and leave it at that. The main thing is that it will be what you want, nothing more and nothing less. Now, that’s what I call freedom! Now, watch my video and see what I came up with. It might give you an idea or two.

OK, OK… I see some of you are chomping at the bit and can’t wait to try this, so here’s a command you can use on a fresh base Ubuntu install to get a very basic XFCE DE to play with:

sudo apt install xfce4 xfce4-goodies slim firefox software-center

To get all of the third party multimedia codecs and fonts, run this command before you restart:

sudo apt install ubuntu-restricted-extras

When the EULA comes up, use the tab key to select OK and then use the arrow keys to choose ‘yes.’ After you are all done, restart your system and start playing!

Have fun!