
Friday 22 June 2012

Meta Tags, SiteMap

Meta Tags/Meta Keywords

A meta tag is a tag used to identify a page's title, description, content, or keywords; in other words, the keywords that identify a page for a specific purpose are its meta tags. Meta tags are placed inside the head section, alongside the title tag. There are lots of meta tag and meta keyword generators; among such sites, submitexpress.com is one.

The simple format of meta tags:

<meta name="Title" Content="........."/>
<meta name="Description" Content="........."/>
 <meta name="Keywords" Content="........."/> 

Yes, that's all there is to it.


Site Map

A sitemap is a shortcut to information about the entire site's content: it represents the whole site in one place, almost like the whole site's content on a single page. On Blogger it is exposed through the default feed URL, which looks like this:

www.ursite.com/feeds/posts/default


A better example: VED-SEO.
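As a rough illustration (not part of the Blogger service itself), here is a minimal Python sketch that fetches a feed like the one above and prints each post's URL. The feed URL reuses the placeholder domain from this post, and the Atom namespace is the standard one Blogger feeds use.

import urllib.request
import xml.etree.ElementTree as ET

ATOM = "{http://www.w3.org/2005/Atom}"
feed_url = "http://www.ursite.com/feeds/posts/default"  # placeholder domain from above

root = ET.fromstring(urllib.request.urlopen(feed_url).read())
for entry in root.iter(ATOM + "entry"):
    for link in entry.iter(ATOM + "link"):
        if link.get("rel") == "alternate":  # the post's public URL
            print(link.get("href"))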



Monday 4 June 2012

Some relevancy factors that determine rank


These are the most common factors by which search engines gauge the value of a link. Search engines have become more and more dependent on metrics about an entire domain, rather than just individual pages. This is why you will sometimes see new pages, or pages with very few links, ranking highly: simply because they sit on an important, trusted, well-linked-to domain.

1st factor. Internal vs External links



Search engines are likely to place more value on link XA than on links BA and CA, because the source page is on an external domain.

When search engines first began valuing links as a way to determine the popularity, importance and relevance of a document, they adopted the classic citation-based rule: what others say about you is far more important and trustworthy than what you say about yourself. Thus, while internal links (links that point from one page on your site to another) do carry some weight, links from external sites matter far more.

This doesn't mean it's unimportant to have a good internal link structure, or that you shouldn't do all you can with your internal links. It just means that a site's performance depends heavily on how other sites on the web have cited it.



2nd factor. Anchor Text

The value of anchor text in search results:



Anchor text is a critical metric passed by links. Early anecdotal (not necessarily true or reliable) test results suggest that link AB will actually pass more relevancy value than link XY for the keyword phrase "e-commerce organization".
The obvious takeaway for SEO businesses: anchor text is one of the biggest factors in overall ranking. The appearance of exact-match anchor text is more beneficial than simply including the target keyword somewhere within an anchor phrase.
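As a toy illustration of anchor text as a signal (invented data, not any engine's method), here is a minimal Python sketch that tallies the anchor phrases of links pointing at a page; a page consistently cited with "e-commerce organization" anchors would gain relevance for that phrase.

from collections import Counter

# (source page, anchor text) pairs for links pointing at one target page
incoming = [
    ("http://a.com", "e-commerce organization"),
    ("http://b.com", "e-commerce organization"),
    ("http://c.com", "click here"),
]

anchors = Counter(text for _, text in incoming)
print(anchors.most_common(1))  # [('e-commerce organization', 2)]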



3rd factor. PageRank (PR)

Whether it's called StaticRank (Microsoft), WebRank (Yahoo!), PageRank (Google) or mozRank (Linkscape), some form of iterative, Markov-chain-based link analysis algorithm is part of every engine's ranking system. PageRank and the others use the analogy that links are votes, and that pages which earn more votes have more influence with the votes they themselves cast.

Let's start with some basic concepts of PageRank, which can be listed as follows:
i)   Every URL is assigned a quantity of PageRank.
ii)  If a page has n links, each link passes that page's PageRank divided by n. Thus, the more links a page has, the lower the amount of PR each one flows.
iii) An iterative calculation that flows juice through the web's entire link graph, dozens of times over, is used to compute each URL's ranking score (see the sketch below).
iv)  Representations like Google's PageRank and SEOmoz's mozRank on a 0-10 scale are logarithmic (thus a PR or mozRank of 4 has 8-10 times the link importance of a PR/mR of 3).



PageRank can be calculated on the page-level link graph, assigning PageRank scores to individual URLs, but it can also be applied to the domain-level graph, which is how metrics like domain mozRank (DMR) are derived. By counting only links between domains, DMR can be used to determine the importance of an entire site.
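To make the list above concrete, here is a minimal Python sketch of the iterative "links are votes" calculation. The three-domain graph, the 0.85 damping factor and the iteration count are illustrative assumptions, not any engine's actual parameters.

def pagerank(graph, damping=0.85, iterations=50):
    """graph maps each URL to the list of URLs it links out to."""
    n = len(graph)
    ranks = {url: 1.0 / n for url in graph}    # i) every URL starts with some PR
    for _ in range(iterations):                # iii) iterate dozens of times
        new_ranks = {url: (1 - damping) / n for url in graph}
        for url, outlinks in graph.items():
            for target in outlinks:
                # ii) each link passes the source's PR divided by its n links
                new_ranks[target] += damping * ranks[url] / len(outlinks)
        ranks = new_ranks
    return ranks

graph = {
    "a.com": ["b.com", "c.com"],
    "b.com": ["c.com"],
    "c.com": ["a.com"],
}
print(pagerank(graph))  # c.com scores highest: it collects the most "votes"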

4th factor. TrustRank (TR)

The basics of TrustRank are described in the Stanford paper "Combating Web Spam with TrustRank" (ilpubs.stanford.edu:8090/770). The basic tenet of TR is that the web's good and trustworthy pages tend to be linked together, and that spam is far more pervasive (spreading widely) outside this core. Thus, by calculating an iterative, PR-like metric that flows juice only from trusted seed sources, a metric like TR can be used to predict whether a site is likely to be high quality or spam.

While the engines don't expose any data points around this particular metric, it's likely that some form of "distance from trusted seeds" logic is applied in their algorithms. Another interesting point about TR is that a measure of who links to known spam sites is likely also part of the engines' metrics. As with PageRank, TrustRank can be calculated on both the page-level and the domain-level graph; Linkscape uses this intuition to build mozTrust (mT) and domain mozTrust (DmT). So the key point is: get links from highly trusted sites, and don't link to potential spam. You can check TR on www.seomastering.com.
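Here is a minimal sketch of the seed-based idea, in the same style as the PageRank sketch above: rank teleports back only to hand-picked trusted seeds rather than uniformly to all pages. The graph, seed choice and parameters are invented for illustration.

def trustrank(graph, seeds, damping=0.85, iterations=50):
    # Trust originates only at the seed pages, not uniformly at every URL.
    base = {url: (1.0 / len(seeds) if url in seeds else 0.0) for url in graph}
    ranks = dict(base)
    for _ in range(iterations):
        new_ranks = {url: (1 - damping) * base[url] for url in graph}
        for url, outlinks in graph.items():
            for target in outlinks:
                # Each link passes on a share of the source's current trust.
                new_ranks[target] += damping * ranks[url] / len(outlinks)
        ranks = new_ranks
    return ranks

graph = {
    "seed.org":   ["good.com"],
    "good.com":   ["seed.org", "spammy.biz"],
    "spammy.biz": ["spammy.biz"],
}
# Pages close to the trusted seed score higher than pages far from it.
print(trustrank(graph, seeds=["seed.org"]))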



5th factor. Domain Authority.

Though the phrase "domain authority" is often discussed in the SEO world, a formal, universal definition doesn't exist. Most practitioners use it to describe a combination of popularity, importance and trustworthiness calculated by the search engines and based largely on link data, though some also feel the engines may use the age of the site here as well. Search engines likely use scores about the authority of a domain when counting links, so despite the fuzzy language it's worth mentioning as a data point: the domain a link comes from is potentially just as important as the individual metrics of the page passing the link.


6th factor. Diversity of sources.

In analyses of correlation data, no single metric has a higher positive correlation with high rankings than the number of root linking domains. This appears to be both a hard metric to manipulate for spam and a metric that captures true, broad popularity and importance.
Although correlation is not causation, the experience of many SEOs, along with empirical data, suggests that a diversity of domains linking to your site or pages has a strong positive effect on rankings. By this logic, it follows that earning a link from a site that has already linked to you in the past is not as valuable as getting a link from an entirely unique domain. It also suggests that links from pages which themselves have diverse link profiles may be more trusted and more valuable than those from low-diversity sources.
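As a quick illustration of counting root linking domains rather than raw links, here is a minimal Python sketch; the backlink list is invented, and the naive "last two labels" domain extraction is an assumption (real tools use public-suffix lists).

from urllib.parse import urlparse

backlinks = [
    "http://blog.example.com/post-1",
    "http://www.example.com/page",
    "http://another-site.org/review",
]

root_domains = set()
for url in backlinks:
    host = urlparse(url).hostname or ""
    root_domains.add(".".join(host.split(".")[-2:]))  # crude: keep last two labels

print(len(root_domains), "unique root linking domains")  # 2, not 3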

7th factor. Uniqueness of source+target

The engines have a number of ways to gauge and predict ownership of, and relationships between, websites. These can include:
i)   A large number of shared links.
ii)  Domain registration data.
iii) Shared hosting IP addresses.
iv)  Relationship information.

If the engines determine that a pre-existing relationship of some kind could inhibit the editorial quality of a link passing between two sites, they may choose to discount or even ignore those links. Anecdotal evidence that links shared between networks of websites pass little value is one data point many in the organic search field point to on this.


8th factor. Location of a link on the page.



Note: Search engines may analyze a page in different sections, assigning weight to content or links based on their location.

Microsoft was the first engine to reveal public data about their plans to do "block-level analysis". Since then, many SEOs have reported observing the impact of this kind of analysis from Google and Yahoo! as well. The internal links in the footer of a web page may not provide the same benefit that those same links would if placed in the top header navigation. Others have reported that one way the engines have been fighting pervasive link advertising is by diminishing the value that external links carry from the sidebar or footer of a web page.
SEOs tend to agree that links from within the content are the most valuable, both for the ranking value the link passes and for the incidental click-throughs as well.



9th factor. Topical Relevance

There are numerous ways the engines can run topical analysis to determine whether two pages cover similar subject matter. Years ago, Google Labs featured an automatic classification tool that could predict, based on a URL, the category and sub-category of virtually any type of content (from medical to real estate, marketing, sports and dozens more). It's possible that the engines use this kind of automatic topical classification to identify neighborhoods around particular topics, and count links more or less based on the behavior they see as accretive (contributing to growth) to the quality of their ranking results.




10th factor. Content and context assessment.

Though topical relevance can provide useful information to the engines about a linking relationship, it's possible that content/context analysis is even more useful in determining the value a link should pass from source to target. In content/context analysis, the engines attempt to discern (recognize), in a machine-parseable way, why a link exists on a page.
When links are given editorially, certain patterns arise: they tend to be embedded in the content, link to relevant sources, and use accepted norms for HTML structure, word usage, phrasing, language and so on. Through detailed pattern matching, and potentially machine learning on large data sets, the engines may be able to form distinctions about what constitutes a legitimate, editorially given link that acts as an endorsement, versus links that may be placed through hacking, links that result from content licensing (which carry little weight), or links that are paid placements.

11th factor. Geographical Location.

The geographic value of a link is highly dependent on the perceived location of its host, and the engines have been getting increasingly sophisticated about employing data points to pinpoint the geographic relevance of a root domain, subdomain or subfolder. These can include:
i)   The host's IP address location.
ii)  The country-code TLD (Top Level Domain), e.g. .de, .uk, .cu, .us, .pk, .in, etc.
iii) The language of the content.
iv)  Registration with local search systems and regional directories.
v)   Association with a physical address.
vi)  The geographic location of the links pointing to the site.



12th factor. Use of rel="nofollow"

Although in the SEO world it feels like ages since nofollow appeared, it has actually only been around since January 2005, when Google announced it was adopting support for the new HTML attribute. Very simply, rel="nofollow", when attached to a link (e.g. <a href="http://example.com" rel="nofollow">anchor text</a>), tells the search engines not to ascribe to it any of the editorial endorsement, or votes, that would normally boost a page's query-independent ranking metrics.

Some question exists in the SEO field as to whether, and how strictly, each individual engine follows the protocol; e.g. Google may still pass some citation quality through Wikipedia's external links despite their use of nofollow.
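As a rough illustration only (no engine publishes its implementation), here is a minimal Python sketch that collects the links on a page while skipping any whose rel attribute includes nofollow; the sample HTML is invented for the example.

from html.parser import HTMLParser

class FollowedLinks(HTMLParser):
    """Collects hrefs of <a> tags, honoring rel="nofollow"."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        rel = (attrs.get("rel") or "").lower().split()
        if "nofollow" not in rel and "href" in attrs:
            self.links.append(attrs["href"])

parser = FollowedLinks()
parser.feed('<a href="/a">counted</a> <a rel="nofollow" href="/b">skipped</a>')
print(parser.links)  # ['/a']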




13th factor. Types of link.


Links can come in a variety of formats. The big three are:
i)   Straight HTML text links.
ii)  Image links.
iii) JavaScript links.


Note: Not all links are treated equally. In both anecdotal (not necessarily reliable) examples and testing, it appears that straight HTML links with standard anchor text pass the most value, followed by image links with keyword-rich alt text, and finally JavaScript links.




14th factor. Other link targets on the source page.


When a page links out externally, both the quality and the targets of the other links that exist on that page may be taken into account by search engines when determining how much link juice should pass.
The PageRank-like algorithms from all the engines divide the amount of juice passed by any given page by the number of links on that page. In addition to this metric, the engines may also consider the quantity of external domains a page points to as a way to gauge the quality and value of those endorsements. For example, a page that links to only a few external resources on a particular topic, spread out among the content, may be perceived differently than a long list of links pointing to many different external sites. One is not necessarily better or worse than the other, but it's possible that the engines pass greater endorsement through one model than the other.

The engines are also likely to look at who else the linking page endorses. Having a link from a page that also links to low-quality pages that may be considered spam is almost certainly less valuable than receiving a link from a page that endorses and links out to high-quality, reputable domains and URLs.


15th factor. Domain-, page- and link-specific penalties.

As nearly everyone in the SEO business is aware, search engines apply penalties to sites and pages, ranging from loss of the ability to pass link endorsement all the way up to a full ban from their indices. If a page or site has lost its ability to pass link endorsement, acquiring links from it provides no algorithmic value from the search engines. Be aware that the search engines sometimes announce penalties publicly but keep others quiet, so that systematic manipulators can't acquire solid data points on who gets hit and who doesn't.

Sunday 3 June 2012

Google Basics

When you sit down at your computer and do a Google search, you're almost instantly presented with a list of results from all over the web. How does Google find web pages matching your query, and determine the order of search results?

In the simplest terms, you could think of searching the web as looking in a very large book with an impressive index telling you exactly where everything is located. When you perform a Google search, our programs check our index to determine the most relevant search results to be returned ("served") to you.

The three key processes in delivering search results to you are:

Crawling
Indexing
Serving results
Crawling 


Crawling is the process by which Googlebot discovers new and updated pages to be added to the Google index.

We use a huge set of computers to fetch (or "crawl") billions of pages on the web. The program that does the fetching is called Googlebot (also known as a robot, bot, or spider). Googlebot uses an algorithmic process: computer programs determine which sites to crawl, how often, and how many pages to fetch from each site.

Google's crawl process begins with a list of web page URLs, generated from previous crawl processes, and augmented with Sitemap data provided by webmasters. As Googlebot visits each of these websites it detects links on each page and adds them to its list of pages to crawl. New sites, changes to existing sites, and dead links are noted and used to update the Google index.
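As a rough illustration of that loop (and nothing like Googlebot's actual implementation), here is a minimal Python sketch: start from a seed URL, fetch each page, extract its links, and add newly discovered URLs to the list of pages to crawl. The seed is a placeholder, and robots.txt handling, politeness delays and crawl scheduling are all omitted.

import urllib.request
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collects absolute URLs from a page's <a href> tags."""
    def __init__(self, base):
        super().__init__()
        self.base = base
        self.found = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.found.append(urljoin(self.base, href))

seen, frontier = set(), ["http://www.example.com/"]  # illustrative seed URL
while frontier:
    url = frontier.pop()
    if url in seen:
        continue
    seen.add(url)
    try:
        html = urllib.request.urlopen(url).read().decode("utf-8", "ignore")
    except OSError:
        continue  # dead links are simply skipped in this sketch
    extractor = LinkExtractor(url)
    extractor.feed(html)
    # Stay on the one illustrative site; a real crawler schedules across the web.
    frontier.extend(u for u in extractor.found if u.startswith("http://www.example.com"))

print(len(seen), "pages crawled")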

Google doesn't accept payment to crawl a site more frequently, and we keep the search side of our business separate from our revenue-generating AdWords service.

Indexing 

Googlebot processes each of the pages it crawls in order to compile a massive index of all the words it sees and their location on each page. In addition, we process information included in key content tags and attributes, such as Title tags and ALT attributes. Googlebot can process many, but not all, content types. For example, we cannot process the content of some rich media files or dynamic pages.
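As a toy illustration of "an index of all the words and their location on each page" (not Google's actual data structures), here is a minimal Python sketch of an inverted index; the two documents are invented.

from collections import defaultdict

docs = {
    "page1.html": "google crawls the web",
    "page2.html": "the web is indexed by google",
}

index = defaultdict(list)  # word -> list of (document, word position)
for doc_id, text in docs.items():
    for pos, word in enumerate(text.lower().split()):
        index[word].append((doc_id, pos))

print(index["web"])  # [('page1.html', 3), ('page2.html', 1)]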



Serving results 


When a user enters a query, our machines search the index for matching pages and return the results we believe are the most relevant to the user. Relevancy is determined by over 200 factors, one of which is the PageRank for a given page. PageRank is the measure of the importance of a page based on the incoming links from other pages. In simple terms, each link to a page on your site from another site adds to your site's PageRank. Not all links are equal: Google works hard to improve the user experience by identifying spam links and other practices that negatively impact search results. The best types of links are those that are given based on the quality of your content.
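As a purely illustrative sketch of combining multiple signals into one relevance score: the two factors and their weights below are invented for the example, and are in no way Google's actual 200+ factors.

def score(page):
    # Invented weights: blend text relevance with a PageRank-like signal.
    return 0.7 * page["text_relevance"] + 0.3 * page["pagerank"]

pages = [
    {"url": "/a", "text_relevance": 0.9, "pagerank": 0.2},
    {"url": "/b", "text_relevance": 0.5, "pagerank": 0.9},
]
for p in sorted(pages, key=score, reverse=True):
    print(p["url"], round(score(p), 2))  # /a 0.69, then /b 0.62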

In order for your site to rank well in search results pages, it's important to make sure that Google can crawl and index your site correctly. Our Webmaster Guidelines outline some best practices that can help you avoid common pitfalls and improve your site's ranking.

Google's Did you mean and Google Autocomplete features are designed to help users save time by displaying related terms, common misspellings, and popular queries. Like our google.com search results, the keywords used by these features are automatically generated by our web crawlers and search algorithms. We display these predictions only when we think they might save the user time. If a site ranks well for a keyword, it's because we've algorithmically determined that its content is more relevant to the user's query.

SEO in a Simple Way

Well, SEO means search engine optimization. SEO is about earning the best possible standing for your site in the search results, and keyword matching is the special focus here.

A keyword is the word that is used to search for the site. Yes....

First:    the keyword in the URL...
Second:   the keyword in the title...
Third:    the keyword in the description/contents...

Understood?

Well now, keyword density: keyword density means the keyword's rate of repetition within the webpage content. It should be around 6-12%.
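As a quick illustration, here is a minimal Python sketch of that calculation: occurrences of the keyword divided by total words, as a percentage. The sample sentence and keyword are invented.

def keyword_density(text, keyword):
    words = text.lower().split()
    if not words:
        return 0.0
    # Strip trailing punctuation so "seo," and "seo" count as the same word.
    hits = sum(1 for w in words if w.strip(".,!?:") == keyword.lower())
    return 100.0 * hits / len(words)

sample = "SEO tips: good SEO starts with keywords, and SEO needs content."
print(round(keyword_density(sample, "SEO"), 1))  # 27.3 (3 of 11 words)

By the 6-12% guideline above, that sample sentence would actually count as over-stuffed.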

SEM Course Features

Professional Search Engine Marketing Training Program :


» Search Engine Optimization (SEO)
» Google Adwords (PPC)
» Social Media Optimization (SMO)
» Keyword Research and analysis
» Search engine Optimization of the website
» Dynamic Optimization of the website
» Basics of SEO friendly website design
» Website architecture analysis
» Search Engine Submissions
» Directory Submissions
» Advanced Link Building and concept of Link Popularity
» Posting on Forums, Blogs and Free classifieds
» Competitor Analysis
» Search Engine Spam
» Optimizing for Google, Yahoo and MSN

SEO Course Features

Search Engine Basic + SEO (On page and Off page) + Advanced SEO Techniques + Social Media Optimization + Other Important SEO Factors + Live Project + SEO Tools Usage

Search Engine Basic
» What is Internet Marketing / online marketing
» How Search Engine Works
» Search Engine History
» Google Commands
» Google Updates
» Page Rank
» Google Toolbar
» Working of Search Engines (Google, MSN, Yahoo etc)
» What is SEO
» Black Hat / White Hat / Grey Hat SEO
» Search Engine Spam
» Website Basics
» Basics of SEO friendly website design
» Site Structure Analysis

Research and Analysis

» Analysis of Website
» Website architecture analysis
» Industry Research
» Competitor Analysis
» Keyword Research and Analysis
» Target segmentation
» Finding Appropriate Keywords

On Page Optimization

» TitleTag Creation
» Meta Tags Creation
» Content Development Check
» Web Content Optimisation
» Image Optimisation
» Code Optimisation
» Density Analysis and Placement
» H1, H2, H3 Tags
» Anchor Text
» Footer
» HTML Validation
» Creation of XML / HTML/ Text Sitemaps
» Sitemap submission in Google and Yahoo
» Google Webmaster Account Setup and management
» Google webmaster tools
» Yahoo Feed Submission

Off Page Optimization

» What is Link Building
» Types of link Building
» Link Building Formula
» Directory Submission
» search engines Submission
» Local search engines Submission
» Social Bookmarking
» Article Submission
» Press Release Submission
» Blog Submission
» Forum Postings
» Classified Posting
» Blog Commenting
» Question Answers Board

Advanced SEO Course

» Static & Dynamic Website Optimization
» Dreamweaver, Basics of HTML
» URL renaming/re-writing
» Broken Links checking and Fixing
» W3c Validation
» RSS
» Feedburner
» Alexa
» Local Business promotion
» Advanced Link Building and concept of Link Popularity
» Optimizing for Google, Yahoo and MSN
» Dynamic website Optimization Technique
» Basics of SEO friendly website design
» Google Analytics setup and management
» Uploading and Managing Website through FTP Software(Filezilla, Cute FTP,Core FTP..)
» SEO Tools
» Panda Update
» Google Caffeine
» Recap

Social Media Optimization

Brand Building using Social Networking Sites
» Facebook
» Twitter
» LinkedIn
» Orkut
» Digg
» Delicious
» Reddit
» Stumble Upon

Promoting website using URL Shortening Methods
Advanced SEO Terms:
» LSI
» Sandbox Effect
» SILOS
» Search Engine Spam
» A/B Testing
» Canonicalization
» Cloaking
» Clustering
» Dead Links and Deep Link
» Geo Targeting
» Google Bombing
» Google Bowling
» Link Farm
» No Follow
» Siphoning
» Stemming

Overview of this SEO training course

While the basic principles of Search Engine Optimisation (SEO) are well-established, it is imperative that you differentiate yourself by using advanced (ethical) SEO techniques, which, in turn, are not subject to Google's spam filters.

This practical, hands-on workshop will provide you with a structured process in order to improve your results from your SEO efforts. We will help you to review your existing website optimisation approaches, analytics and tools against best practice approaches. We will be discussing and demonstrating advanced SEO techniques used by web sites at the top of the search results pages.

This small-group workshop format is constructed to enable peer-sharing of SEO techniques. Sites reviewed for SEO practices will include examples from different sectors - financial, travel, retail, publishing and B2B.



SEO Topics Include:
Learn how to perform keyword research that will attract your target market
Write content that will turn visitors into customers
Find out the importance of keyword placement on your web pages and where to actually place the keywords


Learn optimal techniques to get your pages indexed by the Search Engines.
Find out what it takes to monitor your progress and the tools to help you.
Find out about the tools offered by the search engines to increase your web site's success
What problems might your site run into and how can they be fixed? We'll talk about technical issues that come up
Discover what it takes to stay out of trouble with the search engines by learning about anti-spamming guidelines.


What is Link Popularity and why is it important to SEO?
Receive internal and external link building strategies that will make it easier for the search engines to find your web pages.
Learn about the missing element in SEO: Creativity and find out how you too can get your creative juices flowing!
What is Social Media Marketing and how can it help boost Search Engine rankings?


Understand how the personalisation of search results is changing the rules.
Find out how the major search engines are changing and how you can benefit from those changes


Why are database driven sites unique when it comes to SEO and how can we make sure we get maximum advantage from SEO?
Mega-site SEO - developing architectures that work for large and massive sites


Learn about web analytics and how to analyze your progress
Learn how to implement Google Analytics on your website and why this is a must!
Learn about competitive intelligence applications and how these tools can help increase your website's rankings


Learn about Multi-Variate Testing and A/B Split testing
Learn about technologies that can improve your conversion rates and capture phone calls right from your web site.

How to make any Internet Download Manager (IDM) full version with serial key crack or keygen

Internet Download Manager is a download accelator. By using this software we are able download from the Internet at a high speed.But when we are downloading the software from its official site it is not a full version. That is a 30 day trial pack. After these days are past it will ask you to enter serial key or to buy. So here is the process to make any downloaded IDM full version.After making this it will be registered for lifetime and not ask to register it further.

Process:
1. Download latest version of IDM from here or If you already have IDM installed Update it [Process for update: Click on help on the menu bar and then click on check updates. It will search for updates and ask you to install]
2. If you like your installed version of IDM very much. Then don't update or skip the first step.
3. Now click on "registration" on the menu bar.
4. When you click on registration, Now a new window Will be open ask your First Name, Last Name, Email Address and Serial Key. See Screenshot
5. Enter your First name, Last Name, Email address in the required field.
6. In the serial key field enter any of the following given below

RLDGN-OV9WU-5W589-6VZH1
HUDWE-UO689-6D27B-YM28M
UK3DV-E0MNW-MLQYX-GENA1
398ND-QNAGY-CMMZU-ZPI39
GZLJY-X50S3-0S20D-NFRF9
W3J5U-8U66N-D0B9M-54SLM
EC0Q6-QN7UH-5S3JB-YZMEK
UVQW0-X54FE-QW35Q-SNZF5
FJJTJ-J0FLF-QCVBK-A287M

And click on ok to register.

7. After you click OK, it will show an error message that you have registered IDM using fake serial key and IDM will exit. See screenshot
Now the actual hacking process starts:

8. First of all go to "C:/" drive then go to "Windows" Folder and then go to "System32" folder and then go to "Drivers" folder and then go to "Etc" Folder.
Path is: C:\Windows\System32\drivers\etc
or Simply "C:\Windows\System32\drivers\etc" paste this (without quotes) in the address bar and hit enter. It will open the required folder.
9.in the Etc folder you will see the hosts file. Open the file with notepad.
Now copy the below lines of code and add to hosts file as shown below the image box :

127.0.0.1 tonec.com
127.0.0.1 www.tonec.com
127.0.0.1 registeridm.com
127.0.0.1 www.registeridm.com
127.0.0.1 secure.registeridm.com
127.0.0.1 internetdownloadmanager.com
127.0.0.1 www.internetdownloadmanager.com
127.0.0.1 secure.internetdownloadmanager.com
127.0.0.1 mirror.internetdownloadmanager.com
127.0.0.1 mirror2.internetdownloadmanager.com
10. After adding these codes, save the notepad file. And exit from there.

Now again repeat the 3rd step to 6th step.
Reboot or restart your PC. After restart Now open your IDM it will be full version and not ask you to register.

In windows vista and seven we are not able to save the "hosts" file due to security resion. For that first of all we have to take the ownership of the file to our logged user.So here is the process how to do so.

Also See: IDM free download with Crack and Keygen


Thank You.

SEO Class @ Ved

Search Engine Optimization & Search Engine Marketing (SEO) and (SEM)