Wednesday 16 March 2011

The Key To SEO That Attracts Traffic To Your Website: Keywords


Many websites are built for the purpose of making money; huge amounts of money. In November 2006 there were about 100,000,000 websites. By May 2007 that number had grown to over 118,000,000, according to Netcraft.com. Many of these sites, however, are not attracting enough traffic. Their content does not include well-researched, important keywords that are friendly to search engines. This article discusses the benefits of using SEO-friendly web content.

Why do we need SEO-friendly web content?

SEO-friendly web content attracts visitors to a website. These are the visitors who could drive website conversions. The plain fact is that whether a site is designed to generate profits or not, without visitors it is useless.

How does Search Engine Optimization (SEO) help in attracting visitors to a website?

The most important items in attracting web traffic through SEO are keywords. When people surf the web, they use keywords. Some keywords are used more often than others. If you find out which keywords are searched most often, you can use them effectively and profitably in your web content. This helps the content become SEO friendly.

Spread your keywords and key phrases evenly and reasonably through your content. Make sure that your keyword density does not exceed 3-5% of the total number of words. When this is done, search engine crawlers or spiders will index your content for improved website ranking.
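To make the 3-5% guideline concrete, here is a small sketch in Python (the sample sentence and the function name are my own, not from the article):

```python
import re

def keyword_density(text, keyword):
    """Percentage of the total word count taken up by one keyword."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for word in words if word == keyword.lower())
    return 100.0 * hits / len(words)

content = ("SEO friendly content attracts visitors. Good SEO content "
           "spreads its keywords evenly and reasonably.")
# 2 of 14 words; far above the 3-5% guideline, so this snippet is over-stuffed
print(round(keyword_density(content, "SEO"), 1))
```

For a real article you would run this over the full text and trim keyword mentions until the result falls inside the target range.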

When people surf the web and type a keyword found in your article, your page's title and a snippet of its content are displayed together with many others that contain the same keyword. There is, therefore, a chance that some visitors will click on your link and arrive at your site. Keywords increase the visibility of your website.

Keyword phrases can be derived from keywords. Like keywords, keyword phrases improve site visibility and consequently attract visitors to your website. It is the impressed visitors who could improve your site conversions.

Which popular keyword tools can one use?

Among the popular keyword tools are Google, WordStream and Wordtracker. These provide information such as the number of searches made on a particular word over a specified period of time.

Many websites are unable to attract reasonable traffic because their web content is not SEO friendly. When keywords and phrases are well researched and spread rationally through the content, they attract web traffic. This increases visibility and conversion rates, making your site serve the purpose it was created for.

Thursday 10 March 2011

4 Basic SEO Tips


SEO Tip 1: Your webpage should have at least 500 words of content. The more long-form, valuable content you have, the better. Experience shows that longer, keyword-optimized content gets higher rankings in Google.

SEO Tip 2: The content should be informative and offer some value to the reader. Just think about yourself: do you continue reading anything without value? Google loves useful, valuable content. You know: content is king!

SEO Tip 3: The content should be keyword optimized. The rule of thumb is to keep keyword density within 2 to 4%. If you apply too many keywords it can be harmful; if you use too few, your content is not keyword optimized.

SEO Tip 4: Use heading tags. This helps give a good focus to your content, and readers know what they are reading. It is a very important SEO technique. When you use H1, H2 and even H3 tags and include your keyword in them, your website is counted as having better on-page optimization.

Wednesday 9 March 2011

SEO Techniques



Nowadays, optimization of web pages has become an important concern for webmasters and content developers. Optimization of websites for search engines started in the mid-90s. Initially, webmasters would submit the address of a page to the various search engines, which would then crawl that page, extract links to other pages from it, and index the information found on the page. In this process a spider or crawler downloads the page and stores it on the search engine's server. The indexer then performs the second function: it extracts information about the page, such as the words it contains, their locations, and the weight given to specific words. All of this information is placed into a scheduler for crawling at a later date.

As time passed, site owners started to realize how important it is for a website to rank high in the search engine results. The term "Search Engine Optimization" came into use in 1997. Initially, search algorithms relied heavily on information provided by the webmaster, such as the keyword meta tag or index files in engines. Meta tags act as a guide to each page's content. Using meta data to index pages proved unreliable, because the choice of keywords in the meta tag was not always accurate. Inaccurate data in meta tags caused pages to rank for irrelevant searches. Web content providers also manipulated a number of attributes within the HTML source in order to rank higher in search engines.

Nowadays, search engines do not rely on keyword density alone, because early engines suffered from abuse and ranking manipulation. To give better results to the user, search engines had to adapt to make sure their results pages showed the most relevant search results.

PageRank can be defined as a function of the quantity and strength of inbound links. PageRank also estimates the likelihood that a given page will be reached by a user who randomly surfs the web and follows links from one page to another. This means some links are stronger than others, as a page with a higher PageRank is more likely to be reached by the random surfer.
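The "random surfer" idea above can be sketched as a few lines of power iteration. This is a simplified illustration, not Google's actual algorithm; the three-page graph and the 0.85 damping factor are assumptions for the example:

```python
def pagerank(links, damping=0.85, iters=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        # every page keeps a small teleport share, then inbound links add more
        new = {p: (1 - damping) / n for p in pages}
        for page, outs in links.items():
            share = damping * rank[page]
            if outs:
                for q in outs:
                    new[q] += share / len(outs)
            else:  # dangling page: spread its rank over all pages
                for q in pages:
                    new[q] += share / n
        rank = new
    return rank

# 'home' receives links from both other pages, so it ranks highest
graph = {"home": ["about"], "about": ["home"], "blog": ["home"]}
ranks = pagerank(graph)
print(max(ranks, key=ranks.get))  # prints home
```

The stronger a page's inbound links (here, two pages pointing at "home"), the more of the surfer's probability mass it accumulates.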

SEO SPIDER TRAFFIC


Search engine optimization (SEO) is the activity of improving the volume and quality of real time traffic to web pages or whole sites from search engines through natural search results for targeted keywords. Most often, the website that gets a higher position or ranking in the search results is the one searchers will visit more often. Therefore, it is important that a site is optimized the right way to get real time traffic, to make it more search-engine friendly and earn a higher rank in the search results.

Though SEO helps boost real time traffic to a website, it should not be confused with paid marketing of the site. It may be easy to pay and appear in the paid search results for a set of keywords; however, the concept behind SEO is to get the topmost ranking without paying, because your site is relevant to the search query entered by the user.

The amount of time spent on optimizing a website to get real time traffic can vary from a few minutes to a long-term activity. If your product or service is such that the keywords used are uncommon, then some general SEO would be sufficient to get a top rating. However, if your product or service industry is competitive or in a saturated market, then it is important to dedicate a considerable amount of time and effort to SEO. Even if one decides on simple SEO solutions, it is necessary to understand how search engine algorithms work and which items are crucial in SEO to get real time traffic.

The marketing approach that SEO adopts for increasing a site's relevance is based on how search engines work and what searches are performed by users. The SEO process concentrates its efforts on a site's coding, presentation, and structure, and also on resolving issues that would prevent search engine indexing programs from spidering a website entirely.

Additionally, SEO efforts include adding unique content to web pages to make them more attractive to users and to ensure that search engine robots can easily index the content. Optimizing a site to get real time traffic can either be done by in-house personnel or outsourced to professional optimizers who carry out SEO projects as a stand-alone service, like us - SEO Traffic Spider.

SEO tactics can be varied to achieve top ranking in search results. These fall into three categories (White-Hat SEO, Grey-Hat SEO, and Black-Hat SEO) depending on legitimacy and ethics.

White Hat SEO uses strategies that are considered ethically right and legitimate by SEOs as well as by search engines while spidering a site. It is best to use this tactic as the means of achieving top ranking in the search results for real time traffic. These strategies will not lead to the penalties that are generally imposed when Black Hat SEO is used, and even though it may take some time to reap the rewards of this process, it will surely bring promising results in terms of high rankings.

Grey Hat SEO uses tactics that may be considered legitimate if they are used correctly. However, these have been subjected to misuse by unethical SEOs and webmasters. It is possible that your website will not get penalized or banned for using this strategy, and it may be tempting to use it at times; however, webmasters should be careful with these tactics. This is because search engines may penalize tactics that are subject to abuse even when they are used in legitimate forms.

Black Hat SEO employs techniques that are considered illegitimate and ethically incorrect, and which have been or will surely be penalized and cause your website to be banned by search engines. Since Black Hat SEO is widely subject to abuse and tends to harm the search engine user experience, it is highly recommended to avoid using it as a means of attaining high ranking.

Other SEO Books And Software

SEO BOOK CLICK HERE TO DOWNLOAD SEM-BOOKLET

SEO BOOK CLICK HERE TO DOWNLOAD SEO MADE EASY BOOK

SEO BOOK CLICK HERE TO DOWNLOAD Adsense Secret

SEO BOOK CLICK HERE TO DOWNLOAD the Little Joomla SEO Book

SEO BOOK CLICK HERE TO DOWNLOAD OTHER 3 SEO BOOK[Password www.ilmedunya.co.cc]

SEO SOFTWARE CLICK HERE TO DOWNLOAD SEO SURF SETUP


Thursday 3 March 2011

Hacking a Web Server


With the advent of Windows 2003 and IIS 6.0 a few years back, there was a sharp turn in the way hosting services were provided on the Windows platform. Today, web servers running on Internet Information Services 6.0 (IIS 6.0) are highly popular worldwide - thanks to the .NET and AJAX revolution in designing web applications. Unfortunately, this also makes IIS web servers a popular target amongst hacking groups, and almost every day we read about new exploits being traced and patched. That does not mean that Windows is less secure than Linux. In fact, it's good that we see so many patches being released for the Windows platform, as it clearly shows that the vulnerabilities have been identified and blocked.

Many server administrators have a hard time keeping up with patch management on multiple servers, making it easy for hackers to find a vulnerable web server on the Internet. One good way I have found to ensure servers are patched is to use Nagios to run an external script on a remote host, in turn alerting on the big screen which servers need patches and a reboot after the patch has been applied. In other words, it is not a difficult task for an intruder to gain access to a vulnerable server if the web server is not secured, and then compromise it further to the extent that there is no option left for the administrator but to do a fresh OS install and restore from backups.

Many tools are available on the Internet that allow an experienced or a beginner hacker to identify an exploit and gain access to a web server. The most common of them are:

IPP (Internet Printing Protocol) - which makes use of the IPP buffer overflow. The hacking application sends out a crafted string that overflows the stack and opens up a window to execute custom shell code. It connects the CMD.EXE file to a specified port on the attacker's side, and the hacker is provided with a command shell and system access.

UNICODE and CGI-Decode - where the hacker uses the browser on his or her computer to run malicious scripts on the targeted server. The script is executed using the IUSR_ account, also called the "anonymous account" in IIS. Using these scripts, a directory traversal attack can be performed to gain further access to the system.
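On IIS the defense is configuration and patching, but the underlying idea of a directory traversal check can be illustrated in Python. This is a hedged sketch, not IIS code; the web root path is invented for the example:

```python
import posixpath

WEB_ROOT = "/var/www/site"  # hypothetical web root, just for the example

def is_safe_path(requested):
    """Reject URL paths that would escape the web root via '..' segments."""
    full = posixpath.normpath(posixpath.join(WEB_ROOT, requested))
    return full == WEB_ROOT or full.startswith(WEB_ROOT + "/")

print(is_safe_path("images/logo.png"))   # True
print(is_safe_path("../../etc/passwd"))  # False
```

Normalizing the path first, then comparing it against the root, is what defeats tricks like `..%255c` once they have been decoded: no matter how the `..` segments were encoded, the resolved path either stays under the root or it doesn't.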

Over the years, I've seen that most of the time, attacks on an IIS web server result from poor administration, lack of patch management, bad security configuration, and so on. It is not the OS or the application that is to blame; the basic configuration of the server is the main culprit. I've outlined below a checklist with an explanation of each item. If followed correctly, these would help prevent a lot of web attacks on an IIS web server.

Secure the Operating System
The first step is to secure the operating system that runs the web server. Make certain that Windows 2003 Server is running the latest service pack, which includes a number of key security enhancements.

Always use NTFS File System
The NTFS file system provides granular control over user permissions and lets you give users access only to what they absolutely need in a file or inside a folder.

Remove Unwanted Applications and Services
The more applications and services you run on a server, the larger the attack surface for a potential intruder. For example, if you do not need File and Printer Sharing capabilities on your shared hosting platform, disable that service.

Use Least Privileged Accounts for Service
Avoid running services under highly privileged accounts. By default, Windows Server 2003 has reduced the need for service accounts in many instances, but they are still necessary for some third-party applications. In that case, use local accounts rather than domain accounts. Using a local account means you are containing a breach to a single server.

Rename Administrator and Disable Guest
Ensure that the default account called Guest is disabled, even though it is a less privileged account. Moreover, the Administrator account is a favorite target for hackers, and most of the malicious scripts out there use it to exploit a vulnerable server. Rename the Administrator account to something else so that scripts or programs that have these account names hard-coded fail.

Disable NetBIOS over TCP/IP and SMB
NetBIOS is a broadcast-based, non-routable and insecure protocol, and it scales poorly, mostly because it was designed with a flat namespace. Web servers and Domain Name System (DNS) servers do not require NetBIOS and Server Message Block (SMB). These protocols should be disabled to reduce the threat of user enumeration.

To disable NetBIOS over TCP/IP, right-click the network connection facing the Internet and select Properties. Open the Advanced TCP/IP settings and go to the WINS tab. The option for disabling NetBIOS over TCP/IP should be visible there.

To disable SMB, simply uncheck File and Printer Sharing for Microsoft Networks and Client for Microsoft Networks. A word of caution though - if you are using network shares to store content, skip this step. Only do this if you are sure that your web server is a stand-alone server.

Schedule Patch Management
Make a plan for patch management and stick to it. Subscribe to Microsoft Security Notification Service (http://www.microsoft.com/technet/security/bulletin/notify.asp) to stay updated on the latest release of patches and updates from Microsoft. Configure your server’s Automatic Update to notify you on availability of new patches if you would like to review them before installation.

Run MBSA Scan
This is one of the best ways to identify security issues on your servers. Download the Microsoft Baseline Security Analyzer (MBSA) tool and run it on the server. It will give you details of security issues with user accounts, permissions, missing patches and updates, and much more.

That covers the basics of securing the operating system. There are more fixes that can be performed to secure the server further, but they are beyond the scope of this article. Let's now move on to securing the IIS web server.

IIS 6.0 is secure by default when first set up. This means that on a fresh installation, IIS prevents scripts from running on the web server unless told otherwise. When IIS is first installed, it serves only HTML pages, and all dynamic content is blocked by default: the web server will not serve or parse dynamic pages like ASP, ASP.NET, etc. Since that is not all a web server is meant to do, the default configuration is changed to allow these extensions. Listed below are some basic points that guide you in securing the web server further:

Latest Patches and Updates
Ensure that the latest patches, updates and service packs have been installed for the .NET Framework. These patches and updates fix many issues and enhance the security of the web server.

Isolate Operating System
Do not run your web server from the default InetPub folder. If you have the option to partition your hard disks, use the C: drive for operating system files and store all your client web sites on another partition. Relocate web root directories and virtual directories to a non-system partition to help protect against directory traversal attacks.

IISLockDown Tool
There are some benefits to this tool and some drawbacks, so use it cautiously. If your web server interacts with other servers, test the lockdown tool to make sure it is configured so that connectivity to backend services is not lost.

Permissions for Web Content
Ensure that Script Source Access is never enabled under a web site's properties. If this option is enabled, users can access source files: if Read is selected, source can be read; if Write is selected, source can be written to. To ensure that it is disabled, open IIS, right-click the Websites folder and select Properties. Clear the check box if it is enabled and propagate the setting to all child websites.

Enable Only Required Web Server Extensions
By default, IIS 6.0 does not allow any dynamic content to be parsed. To allow a dynamic page to be executed, you need to enable the relevant extension from the Web Service Extensions property page. Always ensure that "All Unknown CGI Extensions" and "All Unknown ISAPI Extensions" are disabled at all times. If WebDAV and the Internet Data Connector are not required, disable those too.

Disable Parent Paths
This is the worst of all, and thanks to Microsoft, it is disabled in IIS 6.0 by default. The Parent Paths option permits programmers to use ".." in calls to functions by allowing paths that are relative to the current directory using the ..\ notation. Setting this property to True may constitute a security risk, because an include path can access critical or confidential files outside the root directory of the application. Since many programmers and third-party ready-made applications use this notation, I leave it up to you to decide whether it needs to be enabled or disabled. The workaround to Parent Paths is to use the Server.MapPath option in your dynamic scripts.

Disable Default Web Site
If it is not required, stop the Default Web Site that is created when IIS 6.0 is installed, or change the properties of the Default Web Site to run on a specific IP address along with a Host Header. Never keep it running on All Unassigned, as most ready-made hacking tools identify a vulnerable web server by IP address rather than by domain name. If your Default Web Site is running on All Unassigned, it can serve content over an IP address in the URL rather than the domain name.

Use Application Isolation
I like this feature in IIS 6.0, which allows you to isolate applications in application pools. By creating new application pools and assigning web sites and applications to them, you can make your server more efficient and reliable, as it ensures that other applications and sites are not affected by a faulty application running under the same pool.

Summary
All of the aforementioned IIS tips and tools are natively available in Windows. Be sure to apply just one at a time and test your web accessibility in between. It could be disastrous if all of these were implemented at the same time, leaving you wondering which one is causing a problem in case you start having issues.

Final tip: Go to your web server and run "netstat -an" (without quotes) at the command line. Observe how many different IP addresses are trying to gain connectivity to your machine, mostly via port 80. If you see that you have IP addresses with connections established at a number of higher ports, then you've already got a bit of investigating to do.
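As a companion to that tip, the established connections reported by `netstat -an` can be tallied per remote IP with a short Python sketch (the sample lines are invented; real output differs by OS, and you would feed in the actual command output instead):

```python
from collections import Counter

# Hypothetical 'netstat -an' lines, trimmed to the interesting columns.
sample = """\
TCP  10.0.0.5:80   203.0.113.9:51123  ESTABLISHED
TCP  10.0.0.5:80   203.0.113.9:51124  ESTABLISHED
TCP  10.0.0.5:80   198.51.100.7:40222 ESTABLISHED
TCP  10.0.0.5:3389 192.0.2.44:60001   ESTABLISHED
"""

counts = Counter()
for line in sample.splitlines():
    parts = line.split()
    if len(parts) == 4 and parts[3] == "ESTABLISHED":
        remote_ip = parts[2].rsplit(":", 1)[0]  # remote address without port
        counts[remote_ip] += 1

for ip, n in counts.most_common():
    print(ip, n)
```

A single remote IP holding many simultaneous connections, especially on unexpected ports, is exactly the pattern worth investigating.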
About Author
Vishal Vasu has been working on Microsoft Technologies since 1995. Currently he runs his own business of providing technical support and remote server administration under the brand of Hosting Support Guru. In his spare time, Vishal loves to provide level 3 support on Windows platform. He can be reached from his personal website at http://www.vishalvasu.com.

Secret Of Hacking An HYIP Program


An HYIP can be a superb way to experience success in investment. An HYIP, or high yield investment program, can be quite risky, as is the whole HYIP market. But at the same time, if you use it right, it can be quite profitable. So let me show you how you can hack this type of program.

One way to reduce risk is to spread your assets across a number of HYIPs regularly. Besides that, you should not keep any interest in your investment accounts for compounding. Withdrawing it to your e-gold account is a wise thing to do. You can easily browse HYIPs on HYIP ranking and monitoring sites to get an idea of their authenticity. Some of these sites even send out catalogs of HYIPs with all the relevant comments, the payment status of each HYIP and, of course, the rating. You must remember that your investment and the consequent profit are not guaranteed at all in the case of HYIPs. There is every possibility that you can lose even the principal amount, so be prepared.

Take note of some of the must-do things regarding high yield investment programs. Be dead sure never to put all your money into an undersized HYIP program. No matter how promising they may look at the outset, do not get lured away. If you are ready to invest a large sum of money, then make sure to enquire whether the company offers any capital security against it.

Invest your capital into as many programs as you can. It makes more sense to do that than to settle down with two or three small HYIP programs with huge sums. Remember, you need to focus on the plan and not the programs. As for smaller programs, make it a point not to reinvest extra money before you have been paid back. Moreover, with HYIP programs it is advisable to plough back smaller profits from time to time.

High yield investment programs can provide you with anything between 0.7% and 5% per day, to say the least. As for the longevity of a typical HYIP, it generally does not go beyond one year. HYIP forums come with rating systems. Programs that have short investment durations should be preferred. Additionally, programs that pay back the invested sum should be chosen. You will have to keep bad programs out of your way. Try to look through the rating sites before you plunge into any kind of investment. The fact of the matter is that you will have to steer clear of "too good to be true" offers.
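To see how quickly even the low end of that range compounds, here is a small sketch (the $1,000 stake and 30-day horizon are illustrative assumptions, and fees or failed programs are ignored):

```python
def compound(principal, daily_rate, days):
    """Balance after compounding a fixed daily rate, ignoring fees."""
    balance = principal
    for _ in range(days):
        balance *= 1 + daily_rate
    return balance

# 0.7% and 5% per day over 30 days on a $1,000 stake
print(round(compound(1000, 0.007, 30), 2))
print(round(compound(1000, 0.05, 30), 2))
```

The 5%-per-day figure more than quadruples the stake in a month, which is exactly why such offers fall into the "too good to be true" category the paragraph above warns about.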

These steps help you to avoid HYIP losses. Now I earn more than $4,000 a month using my favourite golden rules.

About Author

HYIP expert David Vagner has worked in the HYIP market for more than 2 years. He created his own golden rules, which help him to be successful.

Monday 28 February 2011

Adsense SEO TIPS SEO URDU BOOK Software And Adsense Blackhat Edition

What Is Google Adsense

How To Add Or Create Meta Tags In Blogspot

Tips For Creating Backlinks

How To Generate Traffic To Your Blogspot

Google Adsense
Google Adsense is one of the most popular online earning programs in the world. I like Google Adsense so much because they pay well, and they provide a free earning program you can run from home. Basically, Google Adsense provides ads to display on your website, blogspot or other pages. The basic thing you need for Google Adsense is a website or a blogspot, and Blogspot is free. So what do you have to do to join the Adsense program? First, you need a blog or website with good content of your own. Write about any product, or write an article on anything you think you can write well. After writing a minimum of 5-10 articles (your own articles - don't copy any article from another site), go to blogspot.com and register an account, then post your articles on your blogspot.
To join the Adsense program, go to adsense.com and click on the sign-up process. In the Website URL field, write your website or blogspot name, then fill out the whole form. For Account Type, select Individual, and go on to fill in everything. Wait a few days for the confirmation to come, and you are ready to earn.

In Google Adsense, two important things are meta tags and backlinks. Both are important, especially backlinks, because backlinks increase your page rank.

How To add OR Create Meta Tags In Blogspot 

Meta tags are important for keywords and other descriptions. To create meta tags in Blogspot, first go to the Dashboard of your blogspot, then click on Design. You will see:
Page Elements | Edit HTML | Template Designer

Click on Edit HTML.

Then search for this line in the HTML code:
<b:include data='blog' name='all-head-content'/>
<title><data:blog.pageTitle/></title>
Once you find it, add the meta tag lines after the first line, like this:
<b:include data='blog' name='all-head-content'/>
<meta content='DESCRIPTION HERE' name='description'/>
<meta content='KEYWORDS HERE' name='keywords'/>
<meta content='AUTHOR NAME' name='author'/>
<title><data:blog.pageTitle/></title>
For the description, write your blog description; for the keywords, write your blog keywords; for the author, write your name.
Finished - now your meta tags are created. You can also generate meta tags with a meta tag generator tool: just go to google.com and search for "meta tag generator", and you will find lots of meta tag generator sites.
If you want to check that your meta tags are set right, just go to google.com, search for a "meta tags analyzer" tool and check it.
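As an alternative to a web-based analyzer, you can check the meta tags of a page with Python's built-in HTML parser. This is a small sketch; the sample HTML stands in for your blog's actual source:

```python
from html.parser import HTMLParser

class MetaCollector(HTMLParser):
    """Collect name/content pairs from <meta> tags."""
    def __init__(self):
        super().__init__()
        self.meta = {}

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attrs = dict(attrs)
            if "name" in attrs and "content" in attrs:
                self.meta[attrs["name"]] = attrs["content"]

sample = """<head>
<meta content='My blog description' name='description'/>
<meta content='seo, adsense, blogspot' name='keywords'/>
<meta content='Author Name' name='author'/>
</head>"""

parser = MetaCollector()
parser.feed(sample)
print(parser.meta["keywords"])  # prints seo, adsense, blogspot
```

In practice you would feed the parser the downloaded source of your blog's home page and confirm that the description, keywords and author tags come back exactly as you entered them.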


Tips For Creating Backlinks


1. Start pay-per-click advertising. Google AdWords is best for pay-per-click advertising.

2. Post your comments on other blogspot articles and other pages.

3. Do article marketing.

4. Post your blogspot article link to social directories.


How To Generate Traffic To Your Blogspot

1. Post comments on other blogspots.

2. Tell friends and others.

3. Post comments on forums.

4. Post classified ads.

5. Post comments on top-ranked websites.

6. Pay-per-click advertising is best.

7. Publish your website name on business cards and other things.


Download SEO URDU BOOK CLICK HERE 

SEO AdSense BlackHat  Edition Book In ENGLISH CLICK HERE

Seo Elite 4 Click Here

Download SEO Directory Submitter Click Here 

Download SEO RSS SUBMIT Click Here


Download SEO Blog Comment Poster Click Here


Download SEO Fast Blog Finder Click Here


Download SEO Fast Blog Finder Crack Click Here


Download Referral Secret Book Click Here

Automated PC Restore


With the automated PC restore feature, Microsoft has proved that impossible things can be turned into possible. A user can now restore all important files and programs on a PC to an earlier point when everything was running properly. This saves precious time and money compared with reinstalling the applications and/or the operating system.

Importantly, restore functions allow you to roll back system files, installed programs, local user profiles, Windows File Protection files, registry keys and other important components on a computer to a workable state. Apart from this, the COM+ and IIS metabases, boot files, dynamic system files and WMI databases can be rolled back. The restore feature helps undo changes without affecting any personal files, i.e. the user's emails, Word documents, messages, videos or music lists, pictures and bookmarks. The restore process is carried out without reinstalling the operating system or losing data files.

The automated PC restore feature should be used only after trying all other methods of troubleshooting, as it can change system files and registry entries. It can replace even more files than are needed for restoration. The feature is part of Microsoft Windows operating systems such as Me, XP, Vista and Windows 7; it was first introduced in Windows Me.

The automated PC restore feature works better on Microsoft Windows 7 compared with previous versions. By creating more restore points, it lets users check which files were removed or added when a PC restore was completed. For additional security, Windows Backup is suggested. Restore points include information about registry settings and system information that is needed by Windows.

This helps to secure the personal files on your computer system. The Microsoft Vista operating system is backed with Shadow Copy technology. This technology is an improvement over Microsoft XP's approach, which is based on file filter technology. Shadow Copy technology tracks block-level changes in files, while Windows XP relies on file filter drivers.

To conclude, Microsoft operating systems like Windows XP, Vista, and 7 create restore points automatically. These points don't affect points created manually prior to installing or uninstalling hardware or software.

iPod Made Easy Communication


The iTouch was developed primarily as an advanced version of the iPod that allowed for more features than just storing music, movies and photo media. However, the iTouch actually serves as a great device for communication, one that is almost on a par with the features and services of a phone, without the contract. Though it does not have actual phone capabilities, it comes with just enough features to make it a useful device for communicating. For example, the iTouch can easily connect to the Internet through Wi-Fi networks and other wireless access points, which it will pick up automatically. Once it has access to the Internet, you have the ability to open the doors of communication wide.

You can access the Internet, including your email, which allows you to compose messages to people and check your own messages. This lets you keep in touch on the go and send and receive important things wherever you have Internet access. You will get pop-up notifications so that you know exactly when to check your email. Also, the App Store offers many applications that make communicating an easy feat to accomplish with this device.

For example, the App Store offers Facebook and Twitter applications, both free of charge. These apps put the social networks on your iTouch so that with one click you will be logged into your social networking website and will be able to read up on people's updates and make your own, no matter where you are, as long as you are connected to the Internet. You can keep in touch with people through those applications in many ways: you can chat through the chat features, leave messages, write on walls, post comments and tweets, and do all of the things that you could do from your computer or with a smartphone.

You can also download other apps, such as Text Free, which allows you to text other people's phones or iTouches, or anyone who has the application, without any charge. Text Free supplies you with a personal number that you can share with others so that you can send and receive texts even if you do not have a phone plan. The AIM app installs AOL Instant Messenger straight to your iTouch so you can sign in to your account with a single tap and be connected to all your friends. This lets anyone reach you at any time, as you can keep yourself signed in at all hours and receive notifications and alerts throughout the day.

Finally, you can even install Skype on your iTouch, which allows you to hold Skype-to-Skype conversations on the device. You can make calls to other Skype users and listen through your headphones. All iTouches above the second generation come with a built-in microphone, so you can talk to people directly from the device. A gadget that seemed the ideal music and media storage unit turns out to be an excellent communication device as well.

A Short History of Radio


It is safe to assume that nearly every household in the United States has at least one radio. The invention of the radio depended on two earlier discoveries: the electromagnet and the telegraph.

The electromagnet was discovered in 1825, a discovery that opened the doors to global communication. Five years later, Joseph Henry successfully transmitted an electric current over a mile of wire, causing an electromagnet at the far end to ring a bell. Thus, the telegraph was born.

One of the most recognizable names associated with the telegraph is Samuel Morse, best known for the series of dots (brief sounds) and dashes (more sustained sounds) used to transmit messages letter by letter, the system known as Morse Code. The telegraph remained the sole means of rapid long-distance communication until the invention of the telephone in 1877.

Batteries are interesting, aren't they? As used by the general consumer, they are small but powerfully packed devices that supply power for cameras, alarm clocks, radios, and much else. What makes this power production possible? In the telephone's early design and use, batteries provided the essential source of power for the electromagnet.

A battery has two terminals, one marked '+' or positive and the other '-' or negative. When a battery-operated device is switched on, the electrons produced by the battery move rapidly from the negative terminal to the positive terminal. Something was needed to moderate this rapid flow of electrons; otherwise the battery would be totally expended in a short time.

To accomplish this, a wire is inserted between the positive and negative terminals along with a 'load', such as a radio, creating a small magnetic field around the wire. The resulting electromagnetic waves can carry sounds (speech, music, and so on) as well as visual images invisibly through the air.

Several scientists must be mentioned as essential to radio as it is known to most of us. Mahlon Loomis created the wireless telegraph. Guglielmo Marconi proved the possibility of radio communication: in 1895 he transmitted and received a radio signal. Using the Morse alphabet, he sent the first wireless signal across the English Channel, and in due time he received the Morse letter S, transmitted from England to Newfoundland, marking the beginning of transatlantic radiotelegraphy (1902).

Wireless signals gained widespread use as a means of communication for rescue work when accidents or disasters occurred at sea. In 1899 the United States Army began utilizing wireless communication originating from a lightship off Fire Island. The US Navy followed about two years behind the Army in adopting wireless telegraphy.

In 1903, President Theodore Roosevelt and King Edward VII communicated via this new and improving technology. The famed explorer Robert Peary used radiotelegraphy to convey the message that he had 'found the Pole'.

The first AM radio entered the world of telecommunication in the early 1900s. This device made communication possible using relatively weak waves. This was also when the term 'radio', as we use it today for this technology, came into common use.