Advice Googlebot (Google) To Crawl Your Site   Googlebot is Google's web-crawling bot, or spider. It collects data from web pages to build a searchable index for the Google search engine. Crawling is simply the process by which Googlebot visits new and updated pages; it uses algorithmic programs to determine which sites…

Control or Stop Search Engines to crawl your Website using Robots.txt   Website owners can instruct search engines on which pages to crawl and index by using a robots.txt file. When a search engine robot wants to visit a website URL, say http://www.domainname.com/index.html (as defined in the directory index), it first checks http://www.domainname.com/robots.txt,… (1 comment)
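To illustrate the robots.txt mechanism the excerpt describes, here is a minimal sketch of such a file; the paths and rules are hypothetical examples, not taken from the post:

```
# Block Googlebot from a hypothetical admin area
User-agent: Googlebot
Disallow: /admin/

# All other robots: block a hypothetical private directory, allow the rest
User-agent: *
Disallow: /private/
Allow: /
```

The file must live at the site root (http://www.domainname.com/robots.txt), since that is the only location robots check before crawling.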

What is the difference between this and self in PHP OOP?   PHP classes can have static functions and static variables. Declaring class methods or properties as static makes them accessible without needing an instance of the class. Static functions and variables are not associated with any object, but associated…
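A short sketch of the distinction the excerpt introduces, using a hypothetical Counter class: $this refers to the current object instance, while self refers to the class itself and is used for static members:

```php
<?php
class Counter
{
    public $instanceCount = 0;        // per-object property, accessed via $this
    public static $totalCount = 0;    // class-wide static property, accessed via self

    public function increment()
    {
        $this->instanceCount++;       // $this: state of this particular object
        self::$totalCount++;          // self: static state shared by the class
    }
}

$a = new Counter();
$b = new Counter();
$a->increment();
$a->increment();
$b->increment();

echo $a->instanceCount . "\n";    // prints 2 (only $a's own increments)
echo Counter::$totalCount . "\n"; // prints 3 (all increments, class-wide)
```

Note that $this is only valid inside an instance method, whereas self works in both static and instance contexts.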

PHP cURL functions with example   cURL stands for Client URL. It is a library (libcurl) which allows you to connect and communicate with many different types of servers over many different protocols. libcurl supports the http, https, ftp, gopher, telnet, dict, file, and ldap protocols. libcurl also supports HTTPS certificates, HTTP POST, HTTP… (16 comments)
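As a minimal sketch of the cURL functions the post covers, here is a basic GET request in PHP; the URL is a placeholder, not one from the article:

```php
<?php
// Initialize a cURL session
$ch = curl_init();

curl_setopt($ch, CURLOPT_URL, "http://www.example.com/"); // target URL (placeholder)
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);           // return the response as a string
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);           // follow HTTP redirects

// Perform the request; returns the body as a string, or false on failure
$response = curl_exec($ch);
if ($response === false) {
    echo 'cURL error: ' . curl_error($ch) . "\n";
}

// Free the handle
curl_close($ch);
```

The same handle can be reused with different curl_setopt() calls, e.g. CURLOPT_POST and CURLOPT_POSTFIELDS for an HTTP POST.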

HTML form enctype Attribute   The HTML form enctype attribute's main purpose is to indicate how the form data should be encoded before it is sent to the location defined in the form's action attribute.   For example: <form action="scriptarticle/post_page.php" method="post" enctype="multipart/form-data">   By default, form data is encoded as "application/x-www-form-urlencoded" so that…
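Putting the form tag from the excerpt in context, a complete multipart form might look like this; the field names are hypothetical, and multipart/form-data is the encoding required whenever the form contains a file upload:

```html
<!-- multipart/form-data is needed because of the <input type="file"> field -->
<form action="scriptarticle/post_page.php" method="post" enctype="multipart/form-data">
  <input type="text" name="username">
  <input type="file" name="avatar">
  <input type="submit" value="Upload">
</form>
```

With the default application/x-www-form-urlencoded encoding, the file field would submit only the file name, not its contents.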