Step 1: Back up your data. Step 2: BACK UP YOUR DATA. Step 3: BACK UP YOUR FRACKING DATA (if you never watched the new Battlestar Galactica, that's an f-word replacement). You might lose your data. Read the procedure, process, and release notes on the MariaDB web site. Link as of this publish date: https://mariadb.com/kb/en/upgrading-from-mariadb-104-to-mariadb-105/ There […]
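One hedged sketch of a pre-upgrade backup, assuming mysqldump (and optionally mariabackup) is installed; the /root/backup path is a placeholder:

# logical dump of every database to a timestamped file (path is an assumption)
mysqldump --all-databases --single-transaction --routines --events > /root/backup/all-dbs-$(date +%F).sql
# or a physical backup with mariabackup, if it is installed
mariabackup --backup --target-dir=/root/backup/full-$(date +%F)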
Category: Scripting & Automation
Exam 2: Reality check RHCSA test question: cron question 20
19/20. We will get back to the autofs question. I want 300/300 on that test.
Exam 2: Reality check RHCSA test question: /etc/passwd question 19
18/19. 1 to go.
Exam 2: Reality check RHCSA test question partition/filesystem question 4
Create and mount a 100MiB file system under /meet as ext4. 4 correct out of 4.
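A rough sketch of one way to do it, assuming a spare disk at /dev/vdb (the device name and partition are placeholders):

# carve out a 100MiB partition on a spare disk (device name is an assumption)
parted -s /dev/vdb mklabel gpt mkpart primary ext4 1MiB 101MiB
# put ext4 on it
mkfs.ext4 /dev/vdb1
# create the mount point and make the mount persistent
mkdir -p /meet
echo '/dev/vdb1 /meet ext4 defaults 0 0' >> /etc/fstab
mount -a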
Search and create results
Search /etc/passwd for leah and save the output in /root/results. Interesting results in the GECOS field.
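A one-line sketch, assuming a plain case-sensitive match on the user name:

# find every line in /etc/passwd that mentions leah and save it to /root/results
grep leah /etc/passwd > /root/results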
ntp integration
Integrate server1 with the classroom NTP server. For some reason I already set this up, partially anyway: I ran yum -y install chrony and systemctl enable chronyd.service.
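A short sketch of the remaining steps, assuming the classroom server answers at classroom.example.com (the hostname is a placeholder):

# point chrony at the classroom NTP server (hostname is an assumption)
echo 'server classroom.example.com iburst' >> /etc/chrony.conf
# enable and start the daemon, then confirm the source is reachable
systemctl enable --now chronyd.service
chronyc sources -v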
User needs cron
Few users are really safe with cron, but the RHCSA deities say we need this exercise, so we are going to do it. Let's say the user is leah and she needs to run a job that echoes Shalom every day at 19:07. Looks good to me.
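A minimal sketch of the crontab entry, loaded as root; this replaces any existing crontab leah has, so on a real box crontab -e -u leah is the safer route:

# install the entry for leah: minute 07, hour 19, every day
echo '07 19 * * * echo Shalom' | crontab -u leah -
# verify it landed
crontab -l -u leah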
RHCSA Exam check and set graphical mode
Some testing victims are reporting the entire test is command-line mode; this procedure may not apply there. Run systemctl get-default. If the answer is anything other than graphical.target, run the following command: systemctl set-default graphical.target
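The check-and-set sequence as a quick sketch:

# show the current default target
systemctl get-default
# if it prints anything other than graphical.target, change the default
systemctl set-default graphical.target
# optionally switch now instead of waiting for a reboot
systemctl isolate graphical.target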
Maria-db(mysql) two node installation with galera data replication
This is the first step toward an unsupported installation of the NDC, New Data Cloud. To get the commercial version with full support, click here. This document takes you through the installation and creation of a two-node database installation. This install was done in Google Compute Engine but with the right bandwidth will work […]
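A hedged sketch of the Galera settings on the first node; the node names, addresses, cluster name, and provider library path are all assumptions and vary by distro and MariaDB version:

# write a minimal Galera section (this would overwrite an existing galera.cnf)
cat > /etc/my.cnf.d/galera.cnf <<'EOF'
[galera]
wsrep_on=ON
wsrep_provider=/usr/lib64/galera-4/libgalera_smm.so
wsrep_cluster_name=ndc_cluster
wsrep_cluster_address=gcomm://node1,node2
wsrep_node_name=node1
wsrep_node_address=10.0.0.1
binlog_format=ROW
innodb_autoinc_lock_mode=2
EOF
# bootstrap the cluster on the first node, then start mariadb normally on the second
galera_new_cluster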
Quick and dirty autofs script share
I run a large Unix and Linux server farm on the west coast. In the old days we had problems with script versions: we pushed scripts from a central server, and inevitably, due to network or space issues, the updates did not happen reliably. In 2012 we opened up our Unix management […]
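A quick-and-dirty sketch of serving the scripts through autofs instead of pushing them; the NFS server name and paths are placeholders:

# install and enable the automounter
yum -y install autofs
# indirect map: anything under /share is resolved through /etc/auto.scripts (paths are assumptions)
echo '/share /etc/auto.scripts' >> /etc/auto.master
# the "scripts" key mounts the central export read-only on first access
echo 'scripts -ro,soft nfsserver:/export/scripts' > /etc/auto.scripts
systemctl enable --now autofs
# e.g. ls /share/scripts triggers the mount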