

General information

A "robots.txt" file is a plain text file created with any text editor, such as Notepad. It is placed in the website's root directory and tells search engine crawlers, spiders, and bots which pages and files you do or do not want them to crawl or index. Website owners usually strive to be noticed by search engines, but there are cases where that is not wanted: for instance, if you store sensitive data, or if you want to save bandwidth by keeping heavy, image-filled pages out of the index.

When a crawler accesses a website, it requests a file named "/robots.txt". If such a file is found, the crawler checks it for the website's indexing instructions.

NOTE: there can be only one robots.txt file per website. The robots.txt file for an addon domain or sub-domain needs to be placed in that domain's document root.

How to create a "robots.txt" file

The robots.txt file is created in your website's root folder, so it is served at "yourwebsite.com/robots.txt".
You can use any text editor to create or edit a robots.txt file.
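If you have shell access to your account, one way to create the file is from the command line. A minimal sketch (the paths below are examples; on a real account the file would go in your website's document root, e.g. public_html):

```shell
# Create a demo document root and write a robots.txt into it.
# The path is an example -- substitute your site's real document root.
mkdir -p /tmp/docroot
cd /tmp/docroot

cat > robots.txt <<'EOF'
User-agent: *
Disallow: /administration/
EOF

# Show the resulting file:
cat robots.txt
```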

The basic syntax for the robots.txt file

>> User-agent: [the name of the robot these rules apply to]
>> Disallow: [the page, folder, or path you want to hide]
>> Allow: [the page, folder, or path you want to allow]

Example 1

If you want to allow crawling of everything, use this code (all search engines)
>> User-agent: *
>> Disallow:

Example 2

If you want to disallow crawling of everything (all search engines)
>> User-agent: *
>> Disallow: /

Example 3

If you want to disallow a specific folder (all search engines)
>> User-agent: *
>> Disallow: /foldername/

Example 4

If you want to disallow a specific file (all search engines)
>> User-agent: *
>> Disallow: /filename.html

Example 5

If you want to disallow a folder but allow the crawling of one file inside it (all search engines)
>> User-agent: *
>> Disallow: /folderxyz/
>> Allow: /folderxyz/anyfile.html

Example 6

To allow only one specific robot access to the website
>> User-agent: *
>> Disallow: /
>> User-agent: Googlebot
>> Disallow:

Example 7

To exclude a single robot
>> User-agent: BadBotName
>> Disallow: /

Example 8

If you want to point search engines to your sitemap file
>> User-agent: *
>> Sitemap: http://www.yourdomain.com/sitemap.xml

Example 9

The default PHP-Fusion robots.txt file
>> User-agent: *
>> Disallow: /config.php
>> Disallow: /administration/
>> Disallow: /includes/
>> Disallow: /locale/
>> Disallow: /themes/
>> Disallow: /print.php

Tip: do not disallow files in robots.txt that you want bots to crawl and, especially, do not use it to hide files. The robots.txt file is publicly readable, so disallowing a path tells everyone those files exist. We would recommend putting such files inside a folder and disallowing that folder instead.
Other common mistakes are typos: misspelled directories or user-agents, missing colons after User-agent and Disallow, and so on. As your robots.txt file gets more and more complicated, it is easy for an error to slip in, so a validation tool comes in handy: http://tool.motoricerca.info/robots-checker.phtml

If your account has been hacked or compromised in some way - don't worry, we may be able to help.
We can offer a Managed Security Clean Up Service - for more information see our website or contact Technical Support by our Ticket System.

Our websites are very important, plain and simple. Unfortunately, there are hackers whose goal is to destroy everything we’ve created, killing our search engine rankings and leaving us with nothing but a compromised site.

Many people don’t know how to protect themselves from this problem, so in order to help, here is a list of things that you should do to protect yourself, your website, and your information.

1. Update Software and Scripts
2. Secure Passwords
3. Keep Your own PC Clean
4. ALWAYS Keep Backups!

Update Software and Scripts

The number one way that hackers compromise websites is through outdated software and scripts. If you have PHP-Fusion or any other software that hasn’t been updated even though a new version is available, it’s time to check for an update!

Most software providers offer auto-updates allowing you to upgrade to the newest version without any technical knowledge.
Nearly all software updates released contain security patches that cover up a known security vulnerability.
It’s usually not the server that is exploited; it’s your software and your scripts.

Secure Passwords

Only very trusted people should have the password to your hosting account and/or content management system.
Make sure that your passwords won’t be easy to guess, and that they combine a variety of upper and lower-case letters, numbers, and even symbols.
Never use dictionary words.

Keep Your PC Clean

The Web is full of viruses including Trojan horses, spyware, and even Keyloggers.
These programs can be discreetly installed on your home PC without you even knowing, and will keep track of everything you do.
Keyloggers can record all the keys you press, even while typing passwords!

Make sure that you have a good anti-virus on your PC that actively monitors your activity while you are surfing the Web, as well as downloading files.
There are many good free anti-virus programs out there such as Microsoft Security Essentials for Windows and ClamXav for Macs, that keep updated with new virus definitions, and do a great job of protecting you and your computer.

ALWAYS Keep Backups!

If you have a copy of all the content on your site and your site is hacked, you can re-upload your content to its original form without a problem.
Make sure to back up databases as well, especially if you are using a content management system such as PHP-Fusion: all of your pages and posts are saved within the database it’s tied to!

There is a tool called “Backups” within cPanel that allows you to back up your entire home account, or just certain databases.
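For a rough sketch of what a manual, off-server backup involves from a shell, assuming you have SSH access (all paths below are examples, and the database dump is shown only as a comment because it needs your real credentials):

```shell
# Sketch of a manual site backup. All paths and names are examples.
mkdir -p /tmp/backup_demo/public_html
echo '<?php // placeholder site file' > /tmp/backup_demo/public_html/index.php

# Archive the site files into a compressed tarball:
tar -czf /tmp/site-backup.tar.gz -C /tmp/backup_demo public_html

# A database dump would be taken alongside it (requires real MySQL
# credentials, so it is only illustrated here):
# mysqldump -u dbuser -p dbname > /tmp/db-backup.sql

# List the archive contents to verify the backup:
tar -tzf /tmp/site-backup.tar.gz
```

After creating the archive, download both files to your local machine so a server-side compromise cannot destroy them.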

If you keep these 4 key things in mind, you will not have much need for worry, and with backups taken, you’ll never have to start from scratch if something does go wrong!


Hosting accounts can be hacked for a variety of reasons. A hack can be for the personal achievement of the hacker, to host phishing content which will steal passwords or personal data, to spread viruses or malware, or to send spam e-mail through an account.
After being hacked, you will need to restore your website to a backup. Below you will find the steps for restoring a website from a backup.

The following section will show you how to improve security on your hosting account.

1. Backups
2. Cleaning and Prevention
3. Updating
4. Scanning
5. File Checks
6. Permissions
7. Passwords
8. Additional Resources


Backups

Backups are one of the best ways to recover from a hacked account. PHP-Fusion creates backups every 6 hours.
Backups should be available for at least the past 3-5 days. If your account has been recently hacked, you may be able to restore your site files with a backup from your cPanel >> Backups tool.

Backups are not guaranteed by PHP-Fusion, so we recommend that you keep a local copy of the latest clean files for your website -- you should also download and maintain copies of your database files as well.

If your account was hacked before the oldest available PHP-Fusion backup, then the cPanel or Site Manager backups will not contain clean versions of the content; they will only have the hacked files.

You can create a manual cPanel backup through the cPanel >> Backups tool, and later restore it through the same tool. If you need a copy of your database from a backup, please take it from phpMyAdmin. If you need a database backup restored, contact Technical Support through the Ticket System.

Remember that whether you are able to restore your site from a backup or not, you will want to continue reading and following the steps in this article below.
If your account was hacked there IS an issue or vulnerability in your account that you will need to fix.

We HIGHLY recommend maintaining incremental backups of your account, stored off-server. You can create backups in cPanel >> Backups.

Cleaning and Prevention

Below are steps that you can take to restore your account security and prevent future possible compromise.
You will want to read this section very carefully and follow its directions.

The most common method that we see used to compromise a hosting account is vulnerabilities in user scripts, especially popular content management system scripts.


Updating

First, ensure that all scripts you have installed are running the latest version.
Popular scripts are especially notorious for being hacked.
Since so many sites use them, they are constantly being searched for vulnerabilities by hackers.

Older versions of scripts will sometimes have security vulnerabilities that have been patched by a new release.

Any addons, plugins, modules, or themes for a script should ALSO be kept updated. Be sure that everything is the latest version and is secure from possible compromise.
ALWAYS do your research before installing a 3rd-party addon, script, theme, or module.

If anyone else has had issues with a specific addon, you can usually find information about it posted online.
Use your favorite search engine to look for "XYZ CMS module vulnerability" and if you find any results -- especially concerning the latest version -- DO NOT INSTALL IT to your account.

You can update your scripts easily if you originally installed them using the Softaculous tool in cPanel.
Softaculous will be able to update any script that is installed, but NOT your custom modules, themes, addons, or plugins.

Many later versions of popular scripts have easy update installers in their admin/backend areas.
You usually just need to log in and follow the prompts to successfully update your account's script.

If you cannot access the Admin or Backend for your script, or the update is not working, you can visit the website for your script to find more detailed support documentation for things like hacked installs and manual updates. I have included some links at the end of this document specific to hacked popular script installs that you may find useful.


Scanning

The second most common method of compromise is malicious files on the computers that have account access.
Many types of viruses, malware, and adware will look for hosting account credentials and passwords to send to attackers.

The second step in account cleaning and security is to scan ALL computers that you use to log in to your account via cPanel, FTP, e-mail, Site Manager, etc. for malware and viruses.

After a full virus scan, PHP-Fusion highly recommends running the free version of Malwarebytes Anti-Malware [you can download Malwarebytes from http://www.malwarebytes.org/]. This is a great application for cleaning up malware and adware.

File Checks

Third, be sure to check EVERY FILE that you are hosting! If the attacker has left a vulnerable file on your account, they can likely use it to gain access to your account again in the future.

Look for files that do not belong, or that you did not upload. Download and view the source code for all your files to check for suspicious or hacked script injections.
Some hacks will insert malicious code at the very top or bottom of your legitimate files. This is why checking your files -- every single one -- is critical!


Permissions

Fourth, be sure that all the files on your account have the correct permissions and do not grant more access than necessary. Overly permissive files can pose security vulnerabilities.

You can set permissions using FTP or through cPanel >> File Manager. 777 or "full permissions" should NEVER be used for files and/or directories, even when specified by installation instructions. 755 provides plenty of permissions in the place of 777.
Directories should be set to 755 by default. PHP, HTML, and the majority of all web files should have 644 permissions [or the lowest that works for your website], and ANY files that contain MySQL database or other login credentials [configuration files, usually] should be set to 400 permissions so they are ONLY readable by the account owner and the server itself.
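Those defaults can be applied in bulk from a shell. A minimal sketch, assuming your own document root path (the path and file names below are examples):

```shell
# Apply the recommended defaults: 755 for directories, 644 for files.
# The DOCROOT path is an example -- substitute your real document root.
DOCROOT=/tmp/perm_demo
mkdir -p "$DOCROOT/images"
touch "$DOCROOT/index.php" "$DOCROOT/images/logo.png" "$DOCROOT/config.php"

find "$DOCROOT" -type d -exec chmod 755 {} \;
find "$DOCROOT" -type f -exec chmod 644 {} \;

# Lock down the configuration file that holds database credentials
# so only the owner can read it:
chmod 400 "$DOCROOT/config.php"
```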


Passwords

Fifth, and MOST IMPORTANT of all the steps, change ALL of your account passwords to HIGHLY SECURE PASSWORDS in order to cut off further attacker access.
This includes your main account [cPanel or Client Portal] password, all e-mail account passwords, and custom FTP user account passwords.
Without changing these, the attacker may not have full account access, but can still get into enough portions of your site to check for remaining vulnerabilities or to gather personal information until they ARE able to gain full access.

A large number of exploits are due to the use of weak passwords and are easily preventable.
Passwords should NEVER be based on common dictionary words as these are easily guessed or cracked by such means as a brute force attack [guessing until they get it right].
The cPanel system has an excellent password generator, or you can use an online generator to create a highly secure password, such as: https://secure.pctools.com/guides/password
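If you have shell access, you can also generate a strong random password locally. A small sketch using /dev/urandom (the character set and length are arbitrary choices, not requirements):

```shell
# Build a random 16-character password from /dev/urandom, keeping only
# letters, digits, and a few symbols:
PASS=$(tr -dc 'A-Za-z0-9!@#%^&*' < /dev/urandom | head -c 16)
echo "$PASS"
```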

Be extremely careful with whom you trust your password to! Be sure that anyone who has access to your account also knows to always use secure scripts, and has a malware and virus free computer.
You should change your account passwords AFTER securing your computer, account files, and scripts, because if a vulnerability remains in one of those places, the exploit can continue to capture your new password after each change.

Additional Resources

PHP-Fusion clients who use cPanel can get immediate help resolving a hacked account.
Submit a ticket to our support team and request that your account be reviewed so that we can clean it quickly.

Google’s Cleaning Your Site Guide: http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=163634

Removing Malware From Your Site: http://knoll.google.com/k/riona-macnamara/removing-malware-from-your-site/2vl8me364idq/1#

StopBadware’s Information for Website Owners: http://www.stopbadware.org/home/webmasters


WordPress Hacked FAQ: http://codex.wordpress.org/FAQ_My_site_was_hacked

Reset Admin Password: http://codex.wordpress.org/Resetting_Your_Password

Hardening WordPress (to avoid future hacks): http://codex.wordpress.org/Hardening_WordPress


Joomla! Security Hacked Site Guide: http://docs.joomla.org/Security_Checklist_7

Your Joomla! Is Hacked. Now What? http://www.instantphp.com/news/37-tips-and-tricks/133-your-joomla-is-hacked-now-what.html

Joomla! Security http://docs.joomla.org/Security

Drupal Security Team: My Drupal was Hacked, Now What? http://drupal.org/node/213320

Securing Drupal: http://drupal.org/security/secure-configuration

Secure Shell (SSH), sometimes called Secure Socket Shell, is a UNIX-based command interface and protocol for securely gaining access to a remote computer.
It is widely used by network administrators to control Web and other kinds of servers remotely.

SSH is actually a suite of three utilities - slogin, ssh, and scp - that are secure versions of the earlier UNIX utilities, rlogin, rsh, and rcp. SSH commands are encrypted and secure in several ways. Both ends of the client/server connection are authenticated using a digital certificate, and passwords are protected by being encrypted.

SSH uses RSA public key cryptography for both connection and authentication. Encryption algorithms include Blowfish, DES, and IDEA. IDEA is the default.

The SSH feature is primarily for advanced users with a working knowledge of UNIX.

Tracert is a TCP/IP utility which allows the user to determine the route packets take to reach a particular host.
Trace route works by increasing the "time to live" value of each successive packet sent.

The first packet has a TTL value of one, the second two, and so on. When a packet passes through a host, the host decrements the TTL value by one and forwards the packet to the next host. When a packet with a TTL of one reaches a host, the host discards the packet and sends back an ICMP "time exceeded" message.

Clients are sometimes asked to perform this function to determine if there is a break in communications between themselves and a remote server such as our servers.

Trace route (tracert) works by sending a packet to an open UDP port on a destination machine.
For the initial three packets, trace route sets the TTL (see explanation of TTL) to 1 and releases the packet.
The packet then gets transferred to the first router (completing the first hop), and the TTL gets decremented by the router from 1 to 0.
The router then discards the packet and sends off an ICMP notification packet to the original host with the message that the TTL expired from the router.
This tells tracert what the first hop is and how long it takes to get there.
Traceroute repeats this, gradually incrementing the TTL until a path to the remote host is traced and it gets back an ICMP Port Unreachable message, indicating that the remote host has been reached.

Response times may vary dramatically: sometimes the packet is crossing long distances, and other times the increase comes from network congestion.

For Example:
C:> tracert www.linux.org

will show:
Tracing route to www.linux.org []
over a maximum of 30 hops:

1 <10 ms <10 ms <10 ms mn-bldg-rtr-vlan200-3.gw.more.net []
2 <10 ms <10 ms <10 ms co-r12-01-atm0-0-10.mo.more.net []
3 <10 ms 10 ms <10 ms kc-r12-01-atm1-0-131.mo.more.net []
4 <10 ms 10 ms <10 ms bb2-g8-0.kscymo.swbell.net []
5 <10 ms 10 ms 10 ms sl-gw9-kc-2-0.sprintlink.net []
6 * * *
7 50 ms 61 ms 60 ms 198.ATM7-0.XR2.TOR2.ALTER.NET []
8 50 ms 60 ms 60 ms 194.ATM7-0.GW1.TOR2.ALTER.NET []
9 50 ms 70 ms 60 ms att2-gw.customer.alter.net []
10 61 ms 60 ms 60 ms pos5-0-0.hcap1-ott.bb.attcanada.ca []
11 60 ms 70 ms 70 ms
12 60 ms 81 ms 70 ms router.invlogic.com []
13 70 ms 70 ms 80 ms www.linux.org []
Trace complete.

Note the asterisks at hop 6: they indicate a router that did not respond before the timeout.

How to use Traceroute
Traceroute can be accessed at a DOS or command prompt. An Internet connection must already be established.

Click on Start > Programs > DOS Prompt (Windows 95-98) or Command Prompt (NT). In a Windows 2000 or XP environment, click on Start > Run. Type command into the dialog box, then click OK.
In the resulting command line window, type tracert hostname, where hostname can be a domain name, a machine name or an IP address.
Press Enter.
For example:
C:> tracert www.emints.more.net

Mac OS X
Double-click the Hard Drive icon > Applications folder > Utilities folder > Network Utility program.
Select the Traceroute tab and enter the hostname, where hostname can be a domain name, a machine name or an IP address.
Press Enter.

Setting up a connection within PuTTY

Download PuTTY from http://www.chiark.greenend.org.uk/~sgtatham/putty/download.html
Select putty.exe from the list.
Once you have downloaded it, open it up.
The information you need to enter to connect to your account is the following:
Hostname: yourdomain.com or yourIPAddress
Port: 22

Common Commands

In the examples below, commands will be surrounded by quotes; please do not type the quotes when entering the commands.

Auto Complete Feature
The auto complete feature allows you to use the Tab key when running commands. I will refer to this feature in the commands below where it is especially useful.

The pwd command is used by simply typing pwd. It prints the current working directory, i.e. the directory you are currently in.
When you first SSH to a server you are placed in your home directory, which is located at /home/username/. There is an alias for your home directory: the “~” character. The pwd command can be run from anywhere and will always give you your full working path.
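For example (the /home/username output shown in the comment is illustrative, not a real path on your server):

```shell
# pwd prints the directory you are currently in; "~" is your home directory.
cd ~
pwd     # prints your home directory, e.g. /home/username

cd /tmp
pwd     # prints /tmp
```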

The passwd command is used to change your password. When you run “passwd”, it will prompt you for your current password and then for a new password.

Simple Help With Commands

Keep in mind that when using any Linux command you may want further information on the command itself. There are multiple ways to get this information, and I recommend the first method: simply run the command with the --help switch. For instance, “passwd --help” will give you most of the common switches and uses for the command in case you forget how it is used. The second method is to look at the man page. You can open the man page for a given command with “man passwd”. This opens a pager that lets you scroll through the man (manual) page, which has a very in-depth and detailed description of the command. Usually it gives more information than you need. Another resource is Google, though it will generally just return a man page.

The next command that will be of great use is the “cd” command, which stands for change directory. This command allows you to move around the server and get to the desired location. Its use is as follows: “cd /path/to/change”, where /path/to/change is the place you are trying to get to on the server. For example, if you wanted to get to the /home/username directory, you would use “cd /home/username”, and the shell will change into that directory. A quick pwd will show the result.

By using the command “cd /home/use” and pressing Tab, it should autocomplete to /home/username automatically. If it does not, there is probably more than one match for /home/use, and if you press Tab again it will list all the folders in /home that match, such as user, username, used, or any other folder name that begins with use. This can be very handy when trying to get to a particular directory.

ls is the list command, and one of the commands you will use most frequently. It does just as it sounds and lists a directory’s contents. When I run the ls command I always run “ls -alh”. The “a” switch stands for all, which lists all files including those with a leading “.”, meaning hidden files. The “l” switch lists the files one per line, in long format, rather than in columns. The “h” switch stands for human-readable, which lists file sizes in kilobytes and megabytes instead of bytes; for example, instead of 4096 it will show 4.0K. The ls -alh command shows a lot of information; we will cover the permissions portion in more depth below. The syntax is “ls -alh”, or “ls -alh /path/on/server” if you want to give it a folder/path to list. I personally like to be in the folder and run the ls command.
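A quick demonstration on a throwaway directory (the path and file names are examples):

```shell
# ls -alh lists everything, including hidden "dotfiles", in long format
# with human-readable sizes.
mkdir -p /tmp/ls_demo
touch /tmp/ls_demo/visible.txt /tmp/ls_demo/.hidden

ls -alh /tmp/ls_demo
```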

The stat command can be used to get more detail about a file; generally it prints more detail than you will ever need. One handy use: if you are unsure what permissions a file has set, simply run the command on the file. Its syntax is as follows: “stat filename”. It will provide you with a lot of information about the file, which may or may not be helpful.
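For example (this uses GNU stat’s -c format option, which differs on BSD/macOS; the file name is an example):

```shell
# Check a file's permission bits, owner, and size with stat.
touch /tmp/stat_demo.txt
chmod 644 /tmp/stat_demo.txt

stat -c 'permissions: %a  owner: %U  size: %s bytes' /tmp/stat_demo.txt
```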


Permissions are a broad topic; this section is not comprehensive.

Dissecting the output of ‘ls -l’

Lots of data is presented by ‘ls -l’; we’ll focus strictly on the owner/group and the impact that setting has on the permissions of the file. One of the drawbacks to using ls -l to determine the owner/group of a file is that it will only display the first 8 characters of the username and group name. If the names are longer than 8 characters, you must add -n to the options of ls (ls -ln) to see the userid and groupid in place of the username and group name.

UNIX File Permissions
Unix files have 9 “slots” to determine the permissions applied to that file, plus one extra on the left that tells you what type of file you’re looking at. Here’s the breakdown:

r = read
w = write
x = execute
The file-type bit will usually be either a “d” or a “-“ (there are exceptions which are beyond the scope of this document).

If the object is a directory, the following considerations should be remembered:

Execute permission is required on directories. (Think of “execute” on a directory as the ability to use it).
Removing the ‘read’ bit on a directory will prevent the associated entity from viewing the contents of a directory.
The ‘write’ bit on a directory signifies the ability to add files to a directory, and delete files from a directory. If a person has ‘write’ permission, they can delete any file in that directory, regardless of whether or not they own it, or have write permission to that specific file.

If the object is a file, the following considerations should be remembered:

Execute permission is required if it is a program.
If the file is a script read permission is also required (unlike a binary executable). Remember, scripts are read in, then compiled, then run, so it’s critical that the user have permission to execute AND read the script.
The write bit gives a user the ability to delete a file, rename a file, and modify a file.

Changing Permissions
chmod is the command used to modify permissions on a file or directory. The syntax is: ‘chmod xxx object’ (where xxx is the new permissions and object represents a file or directory). UNIX permissions are bit-masked to achieve the desired level of security. (Adding together the appropriate values achieves the final result). Here are the values used in UNIX permissions:

x (execute) = 1
w (write) = 2
r (read) = 4
When you assign a file a specific permission setting, you give it three numbers (four are possible but, again, that is beyond the scope of this document). The first number represents the bitmasked permissions for the owner, the second represents the group, and the third represents the permissions for everyone else. Total permissions are achieved by adding the values above together (a value of 5 would be read+execute, a value of 7 would be read+write+execute).

[fusion]% chmod 644 filename.txt

[fusion]% ls -l filename.txt

-rw-r--r-- 1 root root 0 Oct 11 16:06 filename.txt

[fusion]% chmod 771 filename.txt

[fusion]% ls -l filename.txt

-rwxrwx--x 1 root root 0 Oct 11 16:06 filename.txt

The “chown” command is very similar to the chmod command, but take extra precaution before using it. Ownership is very important in Linux, and if you mistakenly change the owner of a file you can make it inaccessible to everyone else. You use the chown command as follows: “chown username:groupname /home/USERDIRECTORY”, where username is the user you want to assign and groupname is the group you want to assign. Certain directories, such as .htpasswd and public_html, have special ownership requirements, and the group portion needs to be set to “nobody”. Any time you use the chown command, copy/paste the file names or use autocomplete to avoid typos that may have server-wide effects.

mv is the move command. It allows you to move files from one location to another. Its use is as follows: “mv file /path/to/new/home/”. The mv command is also used to rename files: “mv filename newfilename” renames the file “filename” to “newfilename”. When moving files, be cautious about which files you are moving and where you are moving them to.

When moving files, always remember the trailing / if you are moving into a directory, so the system knows you are moving the file into that directory and not renaming it to that folder name.
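Both uses side by side (the paths and file names are examples):

```shell
# Rename a file, then move it into a directory -- note the trailing slash.
mkdir -p /tmp/mv_demo/archive
touch /tmp/mv_demo/filename

mv /tmp/mv_demo/filename /tmp/mv_demo/newfilename   # rename
mv /tmp/mv_demo/newfilename /tmp/mv_demo/archive/   # move into a directory

ls /tmp/mv_demo/archive
```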

cp is the copy command. It allows you to copy a file from one location to another while leaving the original in place. When I use copy I almost always use it with the -a switch. The syntax is as follows: “cp -a filename /path/to/copy/”. The “a” switch is shorthand for “-dpR”: the “R” part recurses into a directory and copies all files in it along with the directory itself, while the “dp” part preserves attributes, timestamps, and symbolic links. This leaves timestamps and ownership the way they were before the copy. With the “a” switch you can copy files and directories with no problems.
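A small sketch (paths are examples):

```shell
# Copy a directory recursively with -a, preserving timestamps, symbolic
# links, and ownership where permitted.
mkdir -p /tmp/cp_demo/src
echo 'hello' > /tmp/cp_demo/src/file.txt

cp -a /tmp/cp_demo/src /tmp/cp_demo/dest

ls /tmp/cp_demo/dest
```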

Find is a very powerful command that does exactly what its name suggests: find things. The basic syntax of this command is “find /where/to/look -switch filename”. There are many different switches for this command; I will only discuss three of them in this training. The first is -iname, used like so: “find . -iname example.php”. This says to look from the current location, denoted by the “.”; the -iname switch indicates a case-insensitive search on the file name, and the name being searched for is example.php. Using the -iname switch it will find any file named example.php, Example.php, EXAmple.php, and so forth.

The next switch is simply -name, and it does the same thing as -iname except it is case sensitive. The last switch that may be very helpful is -mtime, the last-modified time. It is used like so: “find . -mtime +3”, which looks in the current directory, denoted by the “.”, for all files and folders that were modified more than 3 days ago. The last thing you can add to the find command is the -exec switch. This executes a system command on each result returned by find. It is used like this: “find . -iname test.php -exec rm -rf {} \;”, which finds all files named test.php and then runs the remove command to delete them.
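Putting the switches together (the directory and file names are examples):

```shell
# Demonstrate -iname (case-insensitive match) and -exec (run a command
# on each match).
mkdir -p /tmp/find_demo
touch /tmp/find_demo/Example.php /tmp/find_demo/test.php
cd /tmp/find_demo

find . -iname example.php                  # matches Example.php despite the case
find . -iname test.php -exec rm -f {} \;   # find and delete test.php
```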

rm is the remove command. It is another of those commands that you need to be VERY careful when using; you could very easily delete your entire account with a misplaced forward slash. The remove command is used as follows: “rm filename”. This will not work on a directory; in order to remove a directory you need to use a different switch: “rm -rf directory” will remove the directory. You can also use the -rf switch when removing a single file. The “r” portion of the switch recurses into the directory and deletes all files and folders in that directory, including the directory itself. The “f” switch stands for force, which removes files while ignoring any errors or warnings. Please be very, very cautious once again when using this command.

The touch command allows you to create an empty file with the name provided. This is very handy when you need to create php.ini or .htaccess files. The command is used like this: “touch filename”; if you then perform an ls in that directory you will see the newly created empty file. Please note that if you create this file as root it will be owned by root, and you will need to use the chown command to change the ownership accordingly.

The cat command takes a file’s contents and prints them to the screen. The command’s syntax is as follows: “cat filename”. This will display the entire file in the console window, so be prepared to be overloaded with data if it is a large file. If you do not need to see the entire file, a better command is either head or tail, which I will describe next.

The head command allows you to view the first X lines of a file. The default on the Fusion servers is 10 lines, so if you run the command “head filename” you will get the first 10 lines of that file displayed in the terminal window. You can give it a switch with a number, such as “-20”, and it will show that many lines, in this case 20.

The tail command works exactly like head above, except that it shows the last lines of a file. One very important feature of this command is the “-f” switch. The “f” stands for follow and will let you follow all new input appended to the file. This is extremely useful when you want to watch a mail log or an access log to see what happens when you send an e-mail or visit someone’s website. To get out of the follow, simply press “ctrl + c”, which will terminate it.
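head and tail on a generated 100-line file (the log path in the final comment is an example):

```shell
# Build a 100-line file, then view its start and end.
seq 1 100 > /tmp/head_tail_demo.txt

head -n 3 /tmp/head_tail_demo.txt   # prints 1, 2, 3
tail -n 2 /tmp/head_tail_demo.txt   # prints 99, 100

# tail -f /var/log/maillog          # follow a growing log (Ctrl+C to stop)
```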

vi is an extremely powerful text editor provided in most Linux installations. To use vi you simply type “vi filename” and it will open the file provided. If the file does not exist, vi will create and open it. vi is not completely straightforward and does not work how you might expect. vi has 2 different modes: command mode, which is the mode you start in when first opening a file, and insert mode, which allows you to edit the file. To switch from command mode to insert mode you simply press the “i” key, and then you will be able to edit the file. To get back to command mode, simply press the escape key.

A few helpful commands to utilize within vi when in command mode are,

“:set nu” This command will show the line numbers in the file
“dd” This command will delete the current line
“5dd” This will delete the current line and the next 4 lines (a total of 5)
“yy” This will copy (yank) a line
“p” This will paste the copied line
“Shift+G” This will take you to the end of the file
“53 Shift+G” This will take you to line 53
“/searchterm” This will search for the specified searchterm. So if you did /hello it would find the first instance of the word hello. Pressing “n” after searching will jump to the next instance of the word you are searching for.
“Shift+ZZ” This will save and exit vi
“:q!” This command will quit vi without saving; the ! tells the system you really want to exit without saving.
“:wq” This command will write and quit, same as Shift+ZZ
There are numerous other commands, but these ought to be enough for now. Keep in mind that if you are using vi, the most frustrating thing is getting used to command mode and insert mode. Esc takes you from insert back to command, and “i” takes you from command to insert.
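vi itself is interactive, so it can't run in a static snippet, but its command-mode edits map onto scriptable tools. As a sketch, deleting line 2 of a file (2G then dd in vi, then Shift+ZZ to save) can be reproduced with sed (GNU sed's -i flag assumed here):

```shell
cd "$(mktemp -d)"
printf 'alpha\nbeta\ngamma\n' > demo.txt

# In vi: "vi demo.txt", type 2G to jump to line 2, dd to delete it,
# then Shift+ZZ to save and exit.  The same edit, scripted with sed:
sed -i '2d' demo.txt

cat demo.txt
```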

Pico is another editor which is much easier to use and has most of the same functionality. It does not, however, get another word about it in this training besides how to use it: “pico filename”.

A few commands to help with PHP questions. You can use “php -v” to display information about the PHP version currently running on the server. “php -m” is another handy dandy command that lists all the PHP modules installed on the server.

To zip files into a .zip file, use the following: “zip nameofzipfile.zip file”. nameofzipfile.zip can be anything you want to name the zip file, but it does need to end in .zip. When zipping a folder, add the -r switch so its contents are included: “zip -r nameofzipfile.zip folder”. Using the zip command leaves the original in place and creates the .zip file. To unzip, use the syntax “unzip zipfile.zip”. If the .zip file contains files that already exist, it will prompt whether you want to replace the file, replace all files, replace none, or rename them.

gzip is similar to zip, however it only works on single files; it does not gzip folders. To use gzip, simply run a command like “gzip filename”. Using the gzip command will compress the file and remove the original. You will then see a file named filename.gz, which indicates it is gzipped. To unzip a gzipped file, run the command “gunzip filename.gz”. When you gunzip, if the file already exists it will prompt you to confirm overwriting it. Also, when you gunzip, the .gz file will disappear and you'll just be left with the original file.
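A round trip with a throwaway file (the name is just an example) shows how gzip swaps the original for a .gz and gunzip swaps it back:

```shell
cd "$(mktemp -d)"
echo "sample data" > report.txt

gzip report.txt       # report.txt is replaced by report.txt.gz
ls                    # shows only report.txt.gz

gunzip report.txt.gz  # report.txt.gz is replaced by report.txt
```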

tar is the compression command you will be using most of the time, as it is the norm on Linux. The command is very simple when you break it down and look at it, but it may appear confusing at first. To create a tar file run the command “tar -czvf zipfile.tar.gz foldername”, and this will compress foldername into an archive named zipfile.tar.gz. The tar command keeps all of the files on the server and also places a copy of them in the tar.gz file. You'll notice the .gz extension on the end, as the file is actually gzipped.

The switches associated with the tar command are as follows: the “c” tells the system to create a tar file, the “z” tells it to also gzip the .tar when it is finished, the “v” tells it to be verbose and show you what it is archiving (very handy for a lot of files), and the “f” stands for file, meaning the next argument is the name of the archive to write. So if you left off the “z” switch you'd have a plain .tar file, but as the “z” switch is there it compresses the .tar into a .gz file. To unzip a tar.gz file you use the same command as zipping it, except with an “x” for extract instead of the “c”. So it would look as follows: “tar -xzvf filename.tar.gz”. When you run this command it will output what it is extracting, as the “v” switch is present.

Keep in mind that this command WILL NOT prompt you before overwriting files; it will simply do it. So PLEASE verify none of the files exist where you are extracting so you do not overwrite important files. Also, after extraction the .tar.gz file will still exist; it does not magically disappear into the night.
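Here is the full round trip with hypothetical names; the extraction happens in a fresh directory precisely because tar overwrites without asking:

```shell
cd "$(mktemp -d)"
mkdir -p site/css
echo "body {}" > site/css/main.css

# c = create, z = gzip, v = verbose, f = write to the named file
tar -czvf backup.tar.gz site

# x = extract; unpack into a separate directory so nothing existing
# gets silently overwritten
mkdir restore && cd restore
tar -xzvf ../backup.tar.gz
```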

This command can be used to find where certain applications are installed. It is used as follows: “whereis perl”, and it will provide you the path to perl. This does not work for every installed application, but it does work for most.
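For example (using sh here, since it exists on virtually every system; substitute perl, php, and so on):

```shell
# Prints the binary (and sometimes man page) locations, e.g.
# "sh: /usr/bin/sh /usr/share/man/man1/sh.1.gz"
whereis sh
```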

In an effort to help improve the security on the servers we encourage clients to follow some of our best practices.
Please keep your contact information up to date in cPanel. This is critical for receiving important notices and critical updates.
We also ask that you keep your site applications updated to the latest versions whenever possible.
Doing these simple tasks will not only help you but it will help us as we continue to work to provide you with the best hosting experience possible.

As another reminder for the future, here are 9 security tips our admins highly recommend:

1. Make sure that you have up-to-date spyware/malware/antivirus protection on any computer that connects to the site via FTP or SSH. Run a scan on these machines and fix whatever issues arise.

2. Once the above step is done, change all FTP user account passwords. Make sure the passwords you set are secure: use upper- and lower-case letters and numbers.

3. Make sure that allow_url_include, allow_url_fopen, and register_globals are set to “off” within any customized php.ini files you have within your account. Also make sure you have added insecure functions to the disable_functions list. This only applies if you are running PHP applications within your account.

4. Update any applications you are running to the latest stable versions. Newer versions will contain security patches for known exploits within that application. This also applies to any 3rd party plugins you are running for these applications.

5. Search the internet for ways to further secure these applications. There are usually quite a few extra steps you can take.

6. Keep an eye on the files within your account; pay attention to files that aren't yours, recently modified files, etc. These can be indications of malicious content. Remove any malicious content found.

7. Make frequent personal backups, and make sure that your backups are not infected with malicious code. That way you can easily restore files if you need to.

8. Check all administrative areas of your sites. Make sure they are all password protected. Sometimes hackers remove this protection which can lead to easy entry later.

9. Check your applications for new Administrative user accounts that hackers may have setup as back doors. Remove any and all suspicious user accounts.
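Tip 3 above can be sketched as a php.ini fragment written from the shell. The directives and the disable_functions list below are a typical starting point, not a complete policy, and the location is an example; note that register_globals and allow_url_include only exist in certain PHP versions:

```shell
cd "$(mktemp -d)"

# Write a hardened per-account php.ini (example location; place it
# wherever your host reads custom php.ini files)
cat > php.ini <<'EOF'
allow_url_fopen = Off
allow_url_include = Off
register_globals = Off
disable_functions = exec,passthru,shell_exec,system
EOF

cat php.ini
```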

A Distributed Denial of Service Attack (DDoS) occurs when multiple compromised systems flood the bandwidth or resources of a targeted system, usually one or more web servers.
These systems are compromised by attackers using a variety of methods.
Malware can carry DDoS attack mechanisms; one of the more well known examples of this was MyDoom. Its DoS mechanism was triggered on a specific date and time.

This type of DDoS involved hardcoding the target IP address prior to release of the malware and no further interaction was necessary to launch the attack. A system may also be compromised with a trojan, allowing the attacker to download a zombie agent (or the trojan may contain one).

Attackers can also break into systems using automated tools that exploit flaws in programs that listen for connections from remote hosts. This scenario primarily concerns systems acting as servers on the web.

There are different kinds of DDoS attacks, but in general they can be difficult to manage, and it is hard to determine which connections are legitimate and which are not. Oftentimes a more sophisticated firewall is put in place to filter out many of the connections.

Sometimes other measures, such as putting the site behind another web server that can handle more connections than Apache, may be taken as well. Unfortunately, after all these measures have been taken, sometimes all that can be done is to wait it out.

The Digital Millennium Copyright Act (DMCA) is a United States copyright law which implements two 1996 WIPO treaties.

It criminalizes production and dissemination of technology, devices, or services that are used to circumvent measures that control access to copyrighted works (commonly known as DRM) and criminalizes the act of circumventing an access control, even when there is no infringement of copyright itself.

The DMCA also heightens the penalties for copyright infringement on the Internet. Passed on October 8, 1998 by a unanimous vote in the United States Senate and signed into law by President Bill Clinton on October 28, 1998, the DMCA amended Title 17 of the U.S. Code to extend the reach of copyright, while limiting the liability of online providers for copyright infringement by their users. On May 22, 2001, the European Union passed the EU Copyright Directive, or EUCD, similar in many ways to the DMCA.

The General Data Protection Regulation 2016/679 (GDPR) is a regulation in EU law on data protection and privacy for all individuals within the European Union and the European Economic Area. It also addresses the export of personal data outside the EU and EEA.

If you think your PHP-Fusion site has been hacked, you should follow these steps.

The PHP-Fusion team will do their best to help you and try to rectify why your site got hacked.