A List Of Successful Internet Marketing Tips For Everyone

The following information will give you some effective and smart ideas on how to run a network marketing business successfully.

Read More

5 Ways to Raise the Confidence of Visitors to a Commercial Website

We will give you a number of recommendations that, if followed, will increase visitors' confidence in your commercial website.

Read More

Merry Christmas and Happy 2015 New Year Greetings

Merry Christmas and Happy 2015 New Year Greetings from Utah Web Design & Development Company – Vital Webmaster, LLC

Read More

A New Font for Programming Input

The Font Bureau type foundry has developed Input, a new font family designed for programming; the most interesting member of the family for us is Input Mono.

Read More

Website Security – How to Audit & Secure Your Website Checklist

Recently, a few clients reported that our company's website had a virus or some kind of malware. I was not aware of it until I started digging into it and found that a few files had been injected with malicious code that would either redirect users to other websites or collect user information. I quickly took care of the issue by removing the malicious software and establishing a security policy for our website. The results were positive, and to this day I follow security policies and practices for every website I work on.

From my personal experience, when we create a website most of us do not think about the most important thing: its security. Perhaps that is because we do not have much experience building new websites, but now is the time to start thinking about it. The key habit is to always ask yourself: "What would happen if ...?" If you keep asking that question, your website will come close to being fully protected.

Website security is an urgent task for many site owners today. With the emergence of a huge number of resources in the "Hacking for Dummies" style, even Internet users who previously had no interest in your site, or who knew little about the Internet, are eager to try out their newly gained knowledge and brag about it by hacking your site.
What to do to protect your website from hacking? Where to start?

I will outline a few steps you need to start taking in order to secure your website from hacking. A security policy should start with the safe use of the development tools used to build your site. I will not go into programming details; instead, I will present a number of safety measures for working on a site built on a content management system (CMS). I will keep emphasizing that the most secure sites are the ones you, as a programmer, write yourself from scratch.

To start, I will list the non-programmatic methods that I use to protect a site from hacking. You may not have heard of some of them, or perhaps you simply have not paid attention to them.

Here is the website security list. Please consider these main "anti-hacking" actions to secure your site:

  • Do not use the services of amateur programmers; use only scripts that are properly written. When testing your scripts on a local machine in debug mode, do not be lazy: fix any bugs you find in the code.
  • Do not offer your own scripts for free download or sell them as a resource for others: having your source code before their eyes can help hackers work out the principles behind the rest of your scripts.
  • Periodically run partial or complete tests of the resource from different browsers (especially Internet Explorer, which has a number of bugs that are effectively an "open door" for hackers). Put yourself in the place of a possible intruder and try to find vulnerabilities from every possible angle.
  • Use an .htaccess file in the root directory of your site and regularly review your logs. For an example of how your .htaccess file might look to protect your site from hackers, see my sample code below:

# Use PHP5.3 Single php.ini as default
AddHandler application/x-httpd-php53s .php
##### RewriteEngine enabled – BEGIN
RewriteEngine On
##### RewriteEngine enabled – END

##### RewriteBase set – BEGIN
RewriteBase /
##### RewriteBase set – END

##### File execution order — BEGIN
DirectoryIndex index.php index.html
##### File execution order — END

##### No directory listings — BEGIN
IndexIgnore *
# For security reasons, Option followsymlinks cannot be overridden.
#Options +FollowSymLinks All -Indexes
# For security reasons, Option all cannot be overridden.
#Options +SymLinksIfOwnerMatch All -Indexes
Options SymLinksIfOwnerMatch ExecCGI Includes IncludesNOEXEC -Indexes
##### No directory listings — END

##### Rewrite rules to block out some common exploits — BEGIN
RewriteCond %{QUERY_STRING} proc/self/environ [OR]
RewriteCond %{QUERY_STRING} mosConfig_[a-zA-Z_]{1,21}(=|\%3D) [OR]
RewriteCond %{QUERY_STRING} base64_(en|de)code\(.*\) [OR]
RewriteCond %{QUERY_STRING} (<|%3C).*script.*(>|%3E) [NC,OR]
RewriteCond %{QUERY_STRING} GLOBALS(=|\[|\%[0-9A-Z]{0,2}) [OR]
RewriteCond %{QUERY_STRING} _REQUEST(=|\[|\%[0-9A-Z]{0,2})
RewriteRule .* index.php [F]
##### Rewrite rules to block out some common exploits — END

##### File injection protection — BEGIN
RewriteCond %{QUERY_STRING} [a-zA-Z0-9_]=http:// [OR]
RewriteCond %{QUERY_STRING} [a-zA-Z0-9_]=(\.\.//?)+ [OR]
RewriteCond %{QUERY_STRING} [a-zA-Z0-9_]=/([a-z0-9_.]//?)+ [NC]
RewriteRule .* - [F]
##### File injection protection — END

## Disallow access to rogue PHP files throughout the site, unless they are explicitly allowed
RewriteCond %{REQUEST_FILENAME} (\.php)$
RewriteCond %{REQUEST_FILENAME} !(/index[23]?\.php)$
RewriteCond %{REQUEST_FILENAME} -f
#RewriteRule (.*\.php)$ - [F]

## Disallow access to htaccess.txt, php.ini and configuration.php-dist
RewriteRule ^(htaccess\.txt|configuration\.php-dist|php\.ini)$ - [F]
##### Advanced server protection — END

  • When a website engine such as a CMS is used, watch for updates and install them in a timely manner. Do not use demo versions of components, even if they have the functionality you need.
  • Use reliable software:
    • Using licensed software ensures that no one else has introduced "extra features" your site does not need. Download distributions of web applications, CMS extensions/plugins, widgets, and libraries only from official sites or trusted sources. Of course, the temptation to use a free, fully functional copy of a paid CMS is very strong. But you need to understand two things:
    • First, "cracked" engines distributed on the network often already contain built-in scripts, courtesy of the hackers, that simplify breaking in.
    • And secondly, even if the downloaded CMS is "clean", it will most often be an older version, which is much easier to break: all of its vulnerabilities have long been known to hackers. And, of course, the lack of support for unlicensed versions also complicates management.
    • If a distribution must be downloaded from a dubious site, be sure to check whether it contains malicious code.
    • Carefully study the code of any additional components you want to add to the CMS.
    • Update your CMS and server software on a regular basis and follow the news about vulnerabilities in the CMS you use.
    • Perform regular security audits of servers.
    • After installation, remove the CMS installation and debugging scripts.
  • The choice of hosting should be considered before launching your website. Believing that all hosting offers differ only in disk space, supported languages, and other general parameters is a big mistake where security is concerned. Even though, by law, the service provider's responsibility does not include additional anti-intrusion measures, a minimum set of security tools must be present on the host, summarized as follows:
    • System directories (public_html, cgi, logs, etc.) should have restricted access;
    • When files are added to the server, view permissions should be restricted automatically, so that restricted files not intended for the outside world are never left open for public viewing;
    • The equipment must operate without host failures, outages, and other factors that reduce the availability of the resource.
    • Consider Linux hosting, which in itself is considerably more stable than Windows-based hosting.
  • Use complex passwords for web server software (FTP, SSH, the administrative panel of the hosting and the CMS).
    • Choose complex passwords. A complex password contains at least 11 characters and includes mixed-case letters, numbers, and special characters. Experience shows that even the nimblest brute-force password-guessing software needs a little less than a year to crack an eight-character password. An eight-character password built from lowercase letters and digits alone already has about 2×10^12 combinations, and the count grows far larger when the attacker does not know which character classes were used.
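As a rough illustration of these numbers (my own arithmetic sketch, not part of the original checklist), the search spaces can be computed directly:

```php
<?php
// Search-space sizes for brute forcing (illustrative arithmetic only).
// 36 symbols = lowercase letters + digits; 62 adds uppercase letters.
printf("8 chars, 36-symbol alphabet:  %.2e combinations\n", pow(36, 8));
printf("11 chars, 62-symbol alphabet: %.2e combinations\n", pow(62, 11));
```

The jump of roughly seven orders of magnitude is why the rule above insists on 11 or more characters drawn from mixed character classes.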
    • Do not use the same password to access different services.
    • Even the most secure passwords should be changed every three months, to ensure they have not accidentally been leaked to anyone.
    • Do not store important passwords in a web browser, in a file manager such as an FTP or SSH client, or on any other unproven resource; in fact, avoid storing them anywhere electronically. If you cannot rely on your memory and need to store passwords, use a dedicated password manager: a special program that stores and organizes your passwords in an encrypted file, unlocked with a separate password, also known as a key. After all, remembering one password is much easier than remembering dozens of different ones.
  • Follow security policies on the PCs used for business purposes. All computers that work with the server (those of the webmaster, administrator, content manager, sales manager, etc.) must have anti-virus software installed, with regular updates. Each computer user also needs to take the time to update the operating system and software applications. There is also special anti-virus software designed for installation on the hosting; these programs can quickly detect unauthorized files appearing on the site, determine whether they are harmful, and remove them promptly.
  • Use reliable antivirus software on your office computer, and if you suspect a virus, do not log in to the site's admin panel or hosting features until you have completed the "recovery" of the original files.
  • Control the data entered by users, and monitor user activity on the hosting and in the admin panel. If you administer the resource, you must know which other people or moderators should have access. Login attempts to the admin panel, and all the more so to other management areas, from unknown IP addresses are often a signal of attempts to hijack the site. Most often you can activate activity monitoring in the CMS by installing additional plug-ins or enabling logging modules on your hosting.
    • Filter HTML code in user input fields; otherwise it can end up embedded in the code of your pages.
    • When receiving data from the user, validate it on the server: for example, check that its size is within the permissible length and that submitted values belong to the list of allowed values.
    • Never insert user-supplied data directly into eval() calls, SQL queries, or type conversions. Always check the received information and clean out harmful elements before storing the data.
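A minimal sketch of keeping user data out of the SQL text (my own example; it uses PDO with an in-memory SQLite database so it is self-contained, whereas your site would use its own database connection):

```php
<?php
// Prepared statements: the driver fills the placeholder, so user input
// is never interpreted as part of the SQL itself.
$db = new PDO('sqlite::memory:');
$db->exec('CREATE TABLE users (name TEXT)');

$userInput = "Robert'); DROP TABLE users;--";   // classic injection attempt

$stmt = $db->prepare('INSERT INTO users (name) VALUES (?)');
$stmt->execute([$userInput]);

// The table survived; the payload is stored as a harmless literal string.
echo $db->query('SELECT COUNT(*) FROM users')->fetchColumn();   // prints: 1
```

The same pattern applies with mysqli prepared statements on a typical hosting setup.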
    • Do not leave debugging parameters in the production version of your code.
    • Use a WAF (Web Application Firewall).
  • Keep a “white list” of the authorized IP addresses from which authorized users can login to use your website’s resources.
  • Control user access rights; in particular, provide protection from cross-site request forgery (CSRF). Do not give admin-panel access to untested people; otherwise, do not be surprised when the site gets hacked. Likewise, do not give everyone the right to add HTML code, because unscrupulous users can add malicious code to the site. Restrict access to the administration panel and the CMS database (e.g., phpMyAdmin), as well as to the following resources:
    • backup copies of the code;
    • write access to configuration files;
    • metadata of version control systems (such as the .svn or .git directories).
  • Protect against bots. To protect against robot hackers, you can use special plugins for your CMS, or check visitors' IP addresses against online blacklists.
  • Here are a few things you can do to check the data that users enter:
    • Do not allow users to insert JavaScript code inside <script> tags, in other tags, or in links.
    • Do not place directly on the pages of the site any user-supplied code in <iframe>, <object>, or <embed> tags, or uploaded .jar, .swf, and .pdf files (with their help the site can be made to generate such tags automatically).
    • Maintain a "white list" of allowed HTML tags, so that everything else can be discarded without additional processing.
    • Check references or links inserted by users against the Safe Browsing API.
  • Be careful with ads and third-party code you insert into your site (e.g., affiliate programs).
    • Plug into your site only ads that are provided by a proven advertising system or program.
    • Before connecting your site to a new affiliate system, look for reviews about it and examples of the content it distributes.
    • Avoid "unique offerings" (suspiciously high fees for counters and blocks, monetization of mobile data traffic).
    • Where possible, embed static content on your pages (such as links and images). Avoid loadable <script> and <iframe> elements. Accept Flash, Java, and ActiveX components only in source form that you can check and compile yourself.
    • Do not use affiliate programs with hidden elements.
    • If your site is static, some affiliate systems may request FTP access to change banners on their own. Providing such access is dangerous: if the affiliate system's database is compromised, the attacker gains direct access to the files on your site.
  • Closely monitor access to the service interfaces. Only those who need access to the site should have it, and only for as long as they need it.
  • Revoke the access of specialists who performed short-term jobs on your site, of previous owners, and of people not responsible for the operation of the site (for example, marketing professionals or managers).
  • If you need strangers to work on your site, try to get recommendations about them first. After they finish the necessary work, disable their accounts or change the passwords.
  • Change folder permissions (CHMOD) to no more than 755, and file permissions to 644. This prevents unsafe scripts from being injected on your hosting.
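Applied in bulk, that rule usually looks like the following sketch. It operates on a throwaway temporary directory standing in for your web root, so substitute your real site path (`stat -c` is the GNU coreutils form):

```shell
SITE=$(mktemp -d)                       # stand-in for your real web root
mkdir -p "$SITE/images"
touch "$SITE/index.php" "$SITE/images/logo.png"

find "$SITE" -type d -exec chmod 755 {} +   # directories: rwxr-xr-x
find "$SITE" -type f -exec chmod 644 {} +   # files: rw-r--r--

stat -c '%a' "$SITE/index.php"              # prints: 644
```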
  • Back up your database and the contents of your site folders at least once a week.
  • Make sure the site is free of bugs and errors. If any are found, remove them as soon as possible so you don't give hackers an opportunity to find vulnerabilities on your site.
  • To ensure that your domain is not flooded with spam, add a CAPTCHA to all forms, including registration, comments, feedback, etc.
  • After your site is created, look for modules and components that help ensure the safety of the site and its data.
  • Before adding a file to the site materials, check it with the antivirus software on your computer.
  • Make it a habit to check the server for the last modification dates of folders and files. Typically this can be done by viewing file and folder dates in the control panel's file manager.
  • Unfortunately, when it comes to DDoS attacks, invulnerable sites do not exist. A DDoS attack is carried out by a large number of computers trying to connect to your website at once, so the site receives a flood of requests. The server cannot process such a large number of requests, and the site can stop working. Moreover, if a script is very complex, the site can be "frozen" with even a small number of requests.
  • If you do not know or understand the steps needed to secure the site, seek the advice and help of an experienced administrator, who can advise you and properly install and configure a secure operating system (e.g., Linux or Mac), which is difficult to infect with viruses. Even on Linux or Mac machines, I would suggest using licensed antivirus software.
  • Mask the address used to access the site's admin panel. Most standard CMS admin addresses require a user login and password to manage content; for example, entering the admin area of WordPress is almost always done by typing www.yourdomain.com/wp-login.php into the browser. However, in almost any CMS you can change the default login URL, replacing it with a less obvious address.
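Even without renaming the login URL, you can fence off the login script in .htaccess. This is my own sketch in Apache 2.4 syntax; the IP address is a documentation placeholder, so substitute your own trusted address:

```apache
# Allow the WordPress login script only from one trusted address
<Files "wp-login.php">
    Require ip 203.0.113.10
</Files>
```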
  • Encrypt data on the site. This is required if the resource contains data that should not be accessible to a wide audience. The threat of hacking is always present, and for sites with sensitive information it is even higher. Encryption complicates the extraction of valuable information from whatever hackers manage to steal, and it gives you time to take the necessary measures to eliminate the consequences of a break-in.
  • Always check what the user has entered into a form. To do this, use regular expressions.
  • Always pass incoming data through htmlspecialchars(), which replaces dangerous characters with entities, except in cases where you need to keep the HTML tags.
  • Check all incoming data for validity using string functions and/or regular expressions.
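The last three points combine into one pattern. This is my own sketch; the validation rule is an invented example, so adapt it to the field being checked:

```php
<?php
// Validate with a regular expression first, then escape on output.
$name = "<script>alert('xss')</script>";   // hostile form input

// Hypothetical rule: letters, digits, spaces, and a few punctuation marks only.
if (!preg_match("/^[A-Za-z0-9 .'-]{1,50}$/", $name)) {
    $name = 'Guest';   // fall back to a safe default on invalid input
}

echo htmlspecialchars($name, ENT_QUOTES, 'UTF-8');   // prints: Guest
```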
    • If user input goes into a database query, it should always be escaped, for example with addslashes(). Use this function only if the magic_quotes_gpc directive is disabled; when it is enabled, all incoming data is escaped automatically.
    • Do not worry that the escaping will leave backslashes in the database: the stored data will be exactly what was submitted in the form, and only the query itself is made safe. Run incoming data through functions such as stripslashes() only when it has already been escaped automatically and you need the raw value before building the query.
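To show what the escaping actually does, here is a small illustration of the functions named above:

```php
<?php
// addslashes() prefixes quotes and backslashes so a value can sit
// safely inside a quoted SQL string literal; the database still
// stores the original, unescaped value.
echo addslashes("O'Reilly");   // prints: O\'Reilly
```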
  • Always test how your scripts behave on a variety of input data. Do not forget that if a user is asked to enter a name, you do not want to let them enter arbitrary JS code instead.
  • Always turn off the register_globals directive (register_globals = Off in php.ini, or php_flag register_globals off in .htaccess). As practice shows, the vast majority of programmers do not initialize their variables. I will write more about the importance of register_globals in the future; for now, here is a simple example of what it can do:

<?php
// $array is never initialized; with register_globals on, it can arrive from the URL.
$db = new mysqli("localhost", "root", "", "mydb");
foreach ($array as $key => $value) {
    $db->query("DELETE FROM `my_table` WHERE `field`='$value'");
}

If you had initialized the array with $array = array();, everything would be in order. However, I am sure that not all of you do this. As a result, an attacker visits the address http://www.yourdomain.com/your_script_name.php?array[zero]=0, and your script obediently removes a record that should never have been deleted. Nothing would have happened if the register_globals directive had been disabled.

  • Make sure your web host runs suPHP. Under normal PHP, scripts run as the "nobody" user, so your scripts have wide-open access. With suPHP, access is limited to your own user or to those explicitly granted permission. Not all hosts use suPHP, so make sure yours does, and set up another potential roadblock for hackers.
  • Use SSL to send emails, especially if, somewhere in your millions of untrashed emails, you have ever sent sensitive information.
  • Use SSL to access your control panel or any other site resources (i.e. FTPS for FTP file transfers).
  • Here is what you need to do if the site has been hacked:
  • Identify and remove the malicious code. If many files are infected, restore the site from a backup.
  • Change the super admin and FTP passwords.
  • If Google or another provider has marked your website as malicious, write a letter to Google Webmaster stating that the site is safe for visitors, after you have made sure that it is.
  • Enable cloud hosting if possible. With cloud hosting, your files are backed up off-site in a safe place. In the event of an equipment failure, you can simply insert a new hard drive into your server and start restoring your backup files onto the new hardware.

You may find security a troublesome occupation, but do not forget that you, and only you, are responsible for keeping the site's access passwords safe. You must also understand that even using all of these tools does not give a 100% guarantee of protection against hacking. Remember, too, that the probability of a hacker attack is directly proportional to the value of the information stored on the server. If you own a personal blog, following these steps will let you all but forget about Internet intruders. And, finally, you do not have to do all the work yourself: hire someone who has the experience and knows how to do it.

As a webmaster, I perform analysis of the site for malware and viruses and implement reliable protection from them. This is part of the Website Maintenance Services that I offer for my clients. Please visit Website Analysis Audit services page for more details. Please call  to schedule your free consultation or simply  Contact Me  by submitting your inquiry online.

Read More

SEO Tips on How to Make Crawlers Index Your Site More Often

Fast indexing by search engines is an important task for any webmaster seeking a better SEO ranking (positioning at the top of the search engines). The main thing that attracts search engine robots is new content. If a site is updated frequently, as news portals are, indexing problems should not arise: new pages land in the index literally within minutes. But not every website needs constant updating. A company website, for example, may be filled once, with news added only occasionally. How do you get the search engines to notice new or updated pages quickly and reindex them? Let us consider a few ways to bring search bots to the site.

How do you make crawlers quickly index the new pages of the site?

Regularly updated blocks

Virtually any page of a resource can host regularly updated blocks, for example with news, comments, offline messaging, or social networking feeds. Online stores can display on each page a list of recommended products that changes randomly. On each visit the search engines will see a "new" page, grow used to the idea that it is constantly updated, and decide they need to come by more often.

The downside of this method is a small leakage of page weight, which transfers extra weight to those same forum or news pages.

Visitor retention

If you allow users to post comments on articles, add reviews of goods, or contribute their own articles and other content, the website will be updated constantly. Obviously, visitor activity should be encouraged and supported, and the quality of those same comments and reviews must be monitored so that the site does not turn into a spam site.

The downside of this method is a possible drop in a page's keyword density, and with it its relevance. Keyword density decreases even on perfectly optimized pages as new, non-optimized text arrives and dilutes the main content.

New links

If you need to attract search engine spiders to a newly updated page, get new links to that page (a link from your own site will do, but it is better to get new links from external resources). Following the link, the robot hits the page and reindexes it. It is ideal to get links from news sites.

For Google, Twitter has proven to be a very effective service for expediting indexing, though when driving pages into the index via Twitter, much depends on the account the link comes from.

The only negative is that you have to spend money on buying links or on running the link through Twitter accounts.


New pages

You can always simply delete the old page and create a new one, with a new address and new content. In this case it is best to set up a redirect from the old page to the new one, so as not to lose all the weight and external links. Naturally, some parameters, such as the age of the document, will be reset as well.
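On an Apache host, such a redirect can be declared in .htaccess. The paths below are made up for illustration; a 301 status tells search engines the move is permanent:

```apache
# Permanently redirect the deleted page to its replacement (hypothetical URLs)
Redirect 301 /old-page.html http://www.yourdomain.com/new-page.html
```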

The disadvantage of this method is that it is impractical, for example, if you want to reindex a large number of pages.

Which methods are acceptable, available, and most appropriate for a site, everyone decides for himself, because it all depends on the features and capabilities of the resource. Best of all, though, if the site is constantly updated with fresh content, the search engine spiders will keep indexing your pages as often as you update them.

Read More

How to Embed Flash Movie Video in Web Page?

For those who have tried to find a solution for embedding a Flash (SWF) movie or video in a web page and have not found a good one, the code presented below has always worked for me. I combined my research into one simple solution. It works for me, and I hope it will work for you.

All you need to do is insert this code anywhere within the <body>…</body> tags.

<object classid="clsid:D27CDB6E-AE6D-11cf-96B8-444553540000"
    title="Your Flash Movie Title">
  <param name="movie"
      value="http://www.yourdomain.com/your_flash_movie.swf" />
  <param name="quality" value="high" />
  <param name="wmode" value="opaque" />
  <param name="swfversion" value="" />
  <!-- Next object tag is for non-IE browsers. So hide it from IE using IECC. -->
  <!--[if !IE]>-->
  <object type="application/x-shockwave-flash"
      data="http://www.yourdomain.com/your_flash_movie.swf">
    <!--<![endif]-->
    <param name="quality" value="high" />
    <param name="wmode" value="opaque" />
    <param name="swfversion" value="" />
    <!-- The browser displays the following alternative content
         for users with Flash Player 6.0 and older. -->
    <div>
      <h4>Content on this page requires a newer version of Adobe Flash Player.</h4>
      <p><a href="http://www.adobe.com/go/getflashplayer"
          target="_blank"><img src="http://www.adobe.com/images/shared/download_buttons/get_flash_player.gif"
          alt="Get Adobe Flash player" height="33" /></a></p>
    </div>
    <!--[if !IE]>-->
  </object>
  <!--<![endif]-->
</object>

Read More

Simple Start‐up SEO Checklist to get to Top of Google for Free

If you don't know where to start, here is a simple checklist that will get you started. More detail is provided in the SEO Checklist and in the Best Practices for a Well Designed and Optimized Website sections.

This simple start-up checklist will definitely get you seen on Google and will help you achieve top ranking on the major search engines. After you have gone through the list and applied the SEO tips to your website, continue monitoring and optimizing it.

  • Determine the best keyword phrases for your site. Use WordTracker to help you come up with a list of 2 to 4‐word phrases. Target 100 keywords initially (For details on how to use WordTracker please read sub‐section “Choose Keywords”. For other tools available for Search Engine Optimization, please see “Free Search Engine Optimization Tools” section).
  • Create lots of pages. It’s better to have 20 short pages than 5 long pages on your site. Each page should be 200 words minimum and discuss one topic (one keyword phrase) only. (See examples of some web pages you may want to create for your own website in “Best Practices of a Well Designed and Optimized Website” section.)
  • Optimize each page for your keyword phrases (see “Positive Search Engine Optimization Factors Detailed Checklist” for more details):
  • Include keywords in the <TITLE> of each page. This is a MUST!
  • Replace “ands” with a “|” character
  • Include keywords in the <H1> and <H2> headings of each page. Use a style sheet (CSS) to control the size of the heading text so it blends in better. Google considers <H2> tags more important than <H1> tags, since people were abusing <H1> tags.
  • Include keywords in the first paragraph of each page (within first 20 words).
  • Use keywords in italic and/or bold tags in your main body. Example: <BODY><p><b>keyword</b></p>
  • The last paragraph of your page should also include keywords: <p><b>keyword</b></p></BODY>
  • Include keywords in the text of links. Never use "Click here". Example: if my keyword is "x-ray", use it in your URL: <a href="http://www.domain.com/x-ray_products.html">x-ray products</a>
  • Use keywords in your folders and file names. Example: http://www.domain.com/keyword/keyword-products.html or in your image file keywords.jpg
  • Your images should include ALT attributes with keywords in them. Example: <img src="keyword.jpg" ALT="keyword">
  • Link to each page from your sitemap page, and from each page back to your home page. Also cross‐link between pages that discuss the same topic.
  • Submit your site to the Open Directory Project (ODP): go to http://dmoz.org/add.html. This is very important and should be done early, as it takes time to get listed! (See the "Submit Website to Open Directory" sub-section for a detailed overview of the ODP.)
  • Submit your site to Yahoo! If you can afford it, this is money well spent. Be sure to follow Yahoo's submission guidelines precisely and to the letter.
  • Regularly monitor your progress and modify your efforts:
  • Monitor your site traffic often – it contains a wealth of information.
  • Check to see that all new pages are indexed in Google.
  • Check your site’s ranking on your chosen keywords once a month.
  • Regularly check your incoming links as part of your link campaign.

Read More