Tuesday, January 22, 2008

IT Certification Self-Help Portal

I found this technical self-help website, uCertify.com, very interesting and wanted to share it with readers. The company has been online since 1999. They offer PrepKits, which are interactive software packages that help you learn the material, track your progress, identify areas for improvement, and simulate the actual exam. I sampled a few of their demo quizzes, but I mainly focused on their database kits because I am thinking about an Oracle certification in the near term. On my initial run-through, I found the practice test questions to be relevant and comprehensive, not just some cheesy Q&A effort.

I downloaded their Oracle 10g kit and obtained a key for it. It was a quick download via VDSL and was extremely easy to set up, and I did not have any problems obtaining a key from uCertify. For the premium version, I took a couple of its timed practice tests along with its timed final. I thought the final test was a bit more difficult than the practice tests, which was probably by design. I also noticed a “learn” function for each test question, accompanied by a thorough explanation. Frankly speaking, I needed to use a “RE-learn” function on some of the questions. :) There was also a way to create your own tests, add your own questions, get immediate feedback, tag, print, and review questions, and make notes online; this flexibility was a nice surprise.

One last thing: per their website, if you do not pass the certification exam on your first attempt, they will refund your money. Yes, I was looking for an asterisk after this statement and was pleasantly surprised not to find one.

Here is what you get with the sample version.

30 questions total (this includes the quiz questions)
30 diagnostic test questions
Create tests
22 study notes
Articles, HOWTOs, and study tips
Progress report

Here is an example of what you get with the premium version, which may vary from kit to kit.

301 test questions
3 full length practice tests
105 study notes
Create tests
Unlimited free upgrades for a period of one year from the date of purchase
24x7 technical services
100% money back guarantee
Articles, HOWTOs, and study tips
Discounts on all future purchases

The sample version did give me a decent idea of what the tests were all about. Their PrepKits are designed to help you earn certifications from vendors such as Microsoft, Cisco, Oracle, Adobe, and a few others.

By the way, this is NOT a paid post.

If you are interested in a free, fully functional certification kit of your choice, leave a helpful non-anonymous UNIX-related HOWTO comment here and I will choose a winner after about two weeks. This is a $10 to $100+ value depending on the selected kit.

Update from uCertify: Your readers can use our discount code given on your Blog and get 10% discount on the uCertify PrepKit of their choice. Please use the following Discount code: ESOHUB

Saturday, January 12, 2008

Split XML Records with Perl Script

A colleague, Mahlon Anderson, and I were thinking of ways to split up a fairly large XML file containing approximately 27K records. I wanted to split this file into smaller ones, each holding about 250-300 records, because my former web host service kept complaining about constant CPU quota overloads during uploads. A Perl-based splitter script quickly came to mind.

With the web host service, I had plenty of disk space and plenty of bandwidth, but limited CPU usage. Apparently, I didn’t notice that sticky point in the fine print while signing up for the service.

A different splitting implementation was later used as the permanent solution, but here is Mahlon's "quick and dirty" XML Perl splitter-- printed with his permission of course.

# vi split.pl
#!/usr/bin/perl
# Quick and dirty splitter: start a new output file every $max_records <item> records.
$file = $ARGV[0];

open(_FH, "< $file") or die "Unable to open file: $file\n";

$count = 0;
$files_counter = 1;
$max_records = 300;

while (<_FH>)
{
    # Open the next output file when a new batch starts
    if ($count == 0)
    {
        $filename = $file . "_part_" . $files_counter;
        open(FH2, "> $filename") or die "Unable to open file: $filename\n";
        $count++;
    }

    # Count a record each time a closing </item> tag goes by
    if (/<\/item>/)
    {
        $count++;
    }

    print FH2 $_;

    # Batch is full: reset the counter and move on to the next part file
    if ($count == $max_records + 1)
    {
        $count = 0;
        $files_counter++;
        close(FH2);
    }
}
close(FH2);
close(_FH);
:wq!

# ./split.pl bigxmlfile.xml
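
If you want a quick sanity check that each part file ended up with the expected number of records, counting the closing tags does the trick (this assumes </item> marks the end of a record, as in the script above):
# grep -c '</item>' bigxmlfile.xml_part_*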

Thursday, January 10, 2008

FEATURE_ERROR_USER_OPEN_NO_ACCESS

As I do practically every morning along with millions of others around the world, I logged into my Yahoo.com email account to check mail. To my chagrin, I was unable to log in, and to make matters worse, my user ID/password combination wasn't recognized by the system. My first thoughts were "CAPS LOCK," or a forgotten password (which rarely happens), or hacked, or TOS violations (on what basis??), or corrupted cookies -- who the heck knows. Then I decided to have my password recovered to an alternate email address (Gmail), though I was still darn sure I knew my password. For "you know what and grins," I made an attempt to recover it, but my user ID wasn't even recognized by the system. Here is the message I received after entering my user ID and answering a question about whether I had ever used a credit card.

----
Sorry That You're Having Trouble Signing In

We know that not being able to sign in can be frustrating, so we'll try to make this as quick and easy as possible. To get started, enter your Yahoo! ID and let us know if you've ever used a credit card with Yahoo!.

FEATURE_ERROR_USER_OPEN_NO_ACCESS
----
I wasn't too sure what the aforementioned cryptic error message was all about, but it didn't look promising.

So I decided to recreate the email account, thinking maybe there was a simple glitch in the system. No joy: the ID was "not available" because someone else was already using it -- yes, that someone is ME!

After traversing Yahoo’s help pages for a while, I finally found the customer care form and submitted my problem. I let them know the account was tied to my PAID Yahoo MyBlogLog account (for metadata and stats) and that I was NOT spamming or using the account for any illicit activities, so I asked them to please explain why my account had disappeared into the ether.

After all that, I tried logging in again but no joy.

So now I decided it was time to Google for the answer. Here is what I found from Yahoo's answers via the Google index.

“the solution for this problem is to go to the yahoo! India web page and try logging in through that ..........”

So I brought up the yahoo.co.in homepage and grudgingly logged in. Whoa! It worked.

Wednesday, January 09, 2008

Finding Open Files with lsof Command

When a file is in use by a process, it is possible to delete the file -- or at least it may appear that is the case. The filename is no longer visible via the ls command, but the file and the disk space it occupies remain until the process using it exits.

For example, let's say Sysadmin1 runs a sniffer process in the background to capture and save packets to a file. The capture file starts growing bigger over time. Instead of killing the process, he/she simply deletes the capture file, thinking this will recover the disk space. It doesn't. Believing everything is well, Sysadmin1 goes home.

Now Sysadmin2 shows up and notices the box is running out of disk space. Naturally, the admin wants to figure out what’s rapidly consuming it. The easiest way to locate the growing (and now unlinked) file is to use the lsof command.
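
One handy trick in that situation (on lsof versions that report link counts) is to list open files whose link count has dropped to zero, i.e. files that have been deleted but are still held open by a process:
# lsof +L1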

Another instance where lsof is helpful is when a filesystem refuses to unmount due to open files.
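
For example, if /var refuses to unmount, pointing lsof at the mount point shows which processes still have files open on it (assuming /var is a separate filesystem):
# lsof /var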

Here are a few more practical examples of using the lsof command.

To list all the open files on the /var filesystem:
# lsof +D /var

To list all open files in your current directory only:
# lsof +d .

To list all open Internet files:
# lsof -i

To list all files currently open by user joe:
# lsof -u joe

To list all files open by syslog-ng (this is a great quick way to find logs!):
# lsof -c syslog-ng

To list all files open by pid:
# lsof -p PID

Note: There are additional parameters you can add to the command to narrow the listing to include or exclude types of files and much more!

# lsof -h

Post provided by Mary M. Chaddock

Monday, January 07, 2008

Setup Mail Client on UNIX-based System

In the last post, a mail server setup was demonstrated. This post demonstrates the setup of a mail client, which again is fairly straightforward. Here is the run.

On the client side, ensure the /var/mail directory is present.
# ls -l /var/mail

If not, create it.
# cd /var
# mkdir mail

Now add an entry to the client's /etc/vfstab file so that /var/mail is NFS-mounted from the server at boot time
# vi /etc/vfstab
esoft:/var/mail - /var/mail nfs - yes -
:wq!

Now mount /var/mail
# mount /var/mail

Verify that /var/mail is actually mounted from the mail server
# cd /var/mail
# df -k .
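
On a Solaris-style client, nfsstat can be used as one more check to confirm the NFS mount and show the options in use:
# nfsstat -m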

Setup Mail Server on UNIX-based System

The last post made me think about setting up a mail server on my eBay-purchased UNIX-based box at home. As I recalled from a few years ago, this is a fairly straightforward task. It primarily involves making sure the server’s /var partition is amply sized, its /var/mail directory is exported, and the machine is running as an NFS server.

On the mail server:

Ensure the ownership and permissions on the mail directory are correct:
# cd /var
# ls -ld mail
drwxrwxrwt 4 root mail 512 Jan 5 22:38 mail
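
If the mode or ownership does not match, it can be corrected to the values shown above (1777 is the sticky, world-writable mode displayed as drwxrwxrwt):
# chmod 1777 /var/mail
# chown root:mail /var/mail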

Modify the dfstab file
# cd /etc/dfs
# vi dfstab
share -F nfs -o rw -d "mailbox directory" /var/mail
:wq!
# shareall
# cd /etc/init.d
# ./nfs.server start

Verify /var/mail is shared.
# dfshares
RESOURCE SERVER ACCESS TRANSPORT
esoft:/var/mail esoft - -
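
The same check can be run from a client to make sure the share is visible over the network (esoft being the mail server in this example):
# dfshares esoft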

Setup mail client

Saturday, January 05, 2008

User Unable to POP Incoming Mail via MS Outlook

Yesterday I received a phone call regarding a user having problems with his mail. The user was able to send email but unable to receive it via the MS Outlook client. Usually, I encounter mail-related problems due to recent password changes. However, that wasn’t the case this time because the user was able to connect to the POP server and subsequently send mail. Then I checked whether the ownership of the user's mailbox file was set correctly. It was not.

Here is an example.

# cd /var/mail
# chown esoft:mail esoft
# ls -l /var/mail/esoft
-rw------- 1 esoft mail 110240 Jan 5 22:38 esoft
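
To spot any other mailboxes whose owner does not match the mailbox name, a rough check like this works (the field positions assume standard ls -l output):
# ls -l /var/mail | awk '$3 != $9 {print}'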

Friday, January 04, 2008

FTP Using a Shell Script

Manually transferring a file or files via FTP is a common and convenient way of moving data from one computer to another, especially when it’s a non-recurring event. In reality, though, there are many times when the event is recurring and calls for automation. This can be done with a simple UNIX shell script, which can then be executed from the command line or added to the crontab. In the example script below, myftp.sh, I'm FTP’ing binary files (pictures) from a local computer to a remote one while logging the activity.

Note: Some organizational policies may not allow login/password information in a script file.

# vi myftp.sh
#! /bin/sh
REMOTE='esoft'
USER='anyuser'
PASSWORD='myftp125'
FTPLOG='/tmp/ftplog'
date >> $FTPLOG

ftp -n $REMOTE <<_FTP>>$FTPLOG
quote USER $USER
quote PASS $PASSWORD
bin
cd /myraid/dailyjpgs
mput *.jpg
quit
_FTP
:wq!

Run via CLI
# ./myftp.sh

Add it to the crontab
# crontab -e
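
For example, a crontab entry along these lines would run the transfer nightly at 2 a.m. (the script path is just a placeholder; adjust it to wherever myftp.sh actually lives):
0 2 * * * /path/to/myftp.sh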


Alternate post: FTP Using One-Liner and Perl Script

Per a commenter's request (read the login and password from outside the script):

# cat > /export/ftp/.user.txt
anyuser
control+d

# cat > /export/ftp/.passwd.txt
myftp125
control+d
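
Since those two dot files hold credentials, it is worth tightening their permissions as an extra precaution (not strictly required for the script to work):
# chmod 600 /export/ftp/.user.txt /export/ftp/.passwd.txt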

# vi myftp.sh
#! /bin/sh
REMOTE='esoft'
USER=`cat /export/ftp/.user.txt`
PASSWORD=`cat /export/ftp/.passwd.txt`
FTPLOG='/tmp/ftplog'
date >> $FTPLOG

ftp -n $REMOTE <<_FTP>>$FTPLOG
quote USER $USER
quote PASS $PASSWORD
bin
cd /myraid/dailyjpgs
mput *.jpg
quit
_FTP
:wq!