PhatWare Releases WritePad Nederlands for iPad

Mountain View, California – PhatWare Corporation, a leading provider of software products and professional services for mobile and desktop computers, announces today the immediate availability of WritePad for iPad Dutch Edition. WritePad Nederlands for iPad is the first iPad product to offer natural handwriting recognition input in the Dutch language in addition to keyboard entry. Dutch is the sixth language offered by WritePad, which is also available in English, French, German, Spanish, and Portuguese.

WritePad is an advanced note-taker for iOS, which converts practically any handwriting into computer text. Notes created with WritePad can be sent via email or SMS, Tweeted, saved, posted on a Facebook Wall, printed, exported as PDF, translated to other languages, synchronized with Dropbox, and exchanged directly between two or more iOS devices. WritePad also features integration with events, contacts, maps, and other iPhone resources.

With WritePad, users can:
* Create and edit text documents using the advanced handwriting recognition engine or iPhone keyboard for text entry in landscape or portrait mode
* Improve productivity by utilizing inline gestures, spell checker, context analyzer, and shorthand features
* Improve overall handwriting recognition quality with the Statistical Analyzer by addressing common recognition errors. If this feature is enabled, WritePad will learn the user’s own handwriting style
* Email, Tweet, SMS, Print, or post Facebook updates directly from WritePad
* Synchronize WritePad documents with Dropbox, Evernote, and iTunes or upload documents to Google Docs
* Change WritePad’s appearance by manipulating text, page, and ink colors using the customizable Styles feature

Device Requirements:
* Compatible with iPad
* Requires iOS 4.3 or later
* 12.9 MB

Pricing and Availability:
WritePad Nederlands 5.2 is $9.99 USD (or equivalent amount in other currencies) and available worldwide exclusively through the App Store in the Productivity category.


Founded in October 1997, PhatWare Corporation is a leading provider of easy-to-use, powerful software products and professional services for the mobile and desktop computing marketplace. PhatWare specializes in handwriting recognition, digital ink, note-taking, database, and network management software development. PhatWare's products include such popular titles as CalliGrapher(R), PenOffice(R), PhatNotes(TM), PhatPad(TM), WritePad(TM), and others. PhatWare Corporation is a Microsoft Certified Partner and an Intel Software Partner. Copyright (C) 1997-2011 PhatWare Corporation. All Rights Reserved. Apple, the Apple logo, iPhone and iPod are registered trademarks of Apple Inc. in the U.S. and/or other countries.

localhost

In computer networking, localhost (meaning this computer) is the standard hostname given to the address of the loopback network interface. The name is also a reserved top-level domain name[1] (cf. .localhost), set aside to avoid confusion with the narrower definition as a hostname.
On modern computer systems, localhost as a hostname translates to an IPv4 address in the 127.0.0.0/8 (loopback) net block, usually 127.0.0.1, or ::1 in IPv6.[2]
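As a quick illustration, here is a minimal Python sketch that queries the system resolver to show what localhost maps to on a given machine (the exact results depend on the local resolver configuration):

    import socket

    # Ask the local resolver what "localhost" maps to.
    # On most systems this yields 127.0.0.1 (IPv4) and ::1 (IPv6).
    for family, _, _, _, sockaddr in socket.getaddrinfo("localhost", None):
        print(family.name, sockaddr[0])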
Localhost is specified where one would otherwise use the hostname of a computer. For example, directing a web browser installed on a system running an HTTP server to http://localhost will display the home page of the local web site, provided the server is configured to service the loopback interface.
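The following Python sketch does the same thing programmatically, fetching the local home page over the loopback interface; it assumes an HTTP server is actually listening on port 80 of this machine (adjust the port to match your local server):

    from urllib.request import urlopen

    # Fetch the local site's home page over the loopback interface.
    # Succeeds only if a web server is listening locally on port 80.
    with urlopen("http://localhost/") as response:
        print(response.status, response.read(200))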
Communicating with the loopback interface in the same manner as with other computers on the network, but bypassing the local network interface hardware, is useful for testing software.
Connecting to locally hosted network services, such as a computer game server, or performing other inter-process communication, can be done through loopback addresses in a highly efficient manner.
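The sketch below illustrates this kind of loopback-based inter-process communication in Python; two threads stand in for separate processes here, and no packet ever touches the physical network hardware:

    import socket
    import threading

    def serve(listener: socket.socket) -> None:
        conn, _ = listener.accept()
        with conn:
            conn.sendall(conn.recv(1024))  # echo whatever the client sends

    # Bind to the loopback address; port 0 lets the OS pick a free port.
    listener = socket.create_server(("127.0.0.1", 0))
    port = listener.getsockname()[1]
    threading.Thread(target=serve, args=(listener,), daemon=True).start()

    with socket.create_connection(("127.0.0.1", port)) as client:
        client.sendall(b"ping over loopback")
        print(client.recv(1024))  # b'ping over loopback'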
The Internet Engineering Task Force (IETF) Internet Standard document STD-2 series (e.g., RFC 1700) reserved the 127.0.0.0/8 address block for loopback purposes,[3] until such definitions were updated exclusively through the Internet Assigned Numbers Authority (IANA) website.[4] A later IETF document, Special-Use IPv4 Addresses (RFC 3330), describes the usage of the IPv4 address block 127.0.0.0/8 for loopback purposes.[5] It is therefore excluded from assignment by a Regional Internet Registry or IANA.
For IPv4 communications, the virtual loopback interface of a computer system is normally assigned the address 127.0.0.1 with subnet mask 255.0.0.0. Depending on the specific operating system in use (notably Linux) and the routing mechanisms installed, this populates the routing table of the local system with an entry so that packets destined to any address from the 127.0.0.0/8 block are routed internally to the network loopback device.
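The standard library's ipaddress module can be used to confirm that any address in the 127.0.0.0/8 block, not just 127.0.0.1, falls under this loopback routing rule:

    import ipaddress

    loopback_net = ipaddress.ip_network("127.0.0.0/8")
    print(loopback_net.netmask)                                 # 255.0.0.0
    print(ipaddress.ip_address("127.0.0.1") in loopback_net)    # True
    print(ipaddress.ip_address("127.200.3.4") in loopback_net)  # True - any 127.x.x.x address
    print(ipaddress.ip_address("192.168.1.1") in loopback_net)  # False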
In IPv6, on the other hand, the loopback routing prefix ::1/128 consists of only one address, ::1 (0:0:0:0:0:0:0:1 in full notation), the address with a one at its least significant bit and zeros elsewhere, which is explicitly defined as the loopback address,[6] though additional addresses may be assigned to the loopback interface as needed by the host administrator.
Any IP datagram with a source or destination address set to a loopback address must not appear outside of a computing system, or be routed by any routing device. Packets received on an interface with a loopback destination address must be dropped.
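Here is a rough Python sketch of that rule using the ipaddress module (the function name is illustrative, not part of any standard API): a forwarding device should refuse to route any datagram whose source or destination is a loopback address, whether IPv4 (127.0.0.0/8) or IPv6 (::1).

    import ipaddress

    def must_drop(src: str, dst: str) -> bool:
        # True if the datagram carries a loopback source or destination
        # address and therefore must not be forwarded.
        return (ipaddress.ip_address(src).is_loopback
                or ipaddress.ip_address(dst).is_loopback)

    print(must_drop("10.0.0.1", "127.0.0.1"))  # True  - loopback destination
    print(must_drop("::1", "2001:db8::1"))     # True  - loopback source
    print(must_drop("10.0.0.1", "10.0.0.2"))   # False - routable normally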
One notable exception to the use of the 127/8 network addresses is their use in Multiprotocol Label Switching (MPLS) traceroute error detection techniques (RFC 4379) in which their property of not being routable provides a convenient means to avoid delivery of faulty packets to end users.

10 SEO Mistakes to Avoid

When you're starting off in SEO there's a lot of conflicting information about what to do and what not to do. There's a constant battle between White Hat and Black Hat SEO over which is more effective.
It's pretty easy to get frustrated at how long it actually takes to achieve a decent ranking. This can lead to taking the wrong advice. I learned the SEO mistakes listed below by trial and error.

Here are some rules to follow if you DO NOT want your site to rank in search engines.

1. Build a Flash Only Website

The days of Flash only websites are probably behind us (thank God). Having said this, as a web developer it's surprising how often I'm asked, 'Does that include Flash?'

Search engines cannot read content embedded in Flash files, so you shouldn't use Flash to build websites. It's perfectly all right to have a Flash feature box or slideshow or something like that, so long as the rest of your site is built with HTML.

2. Hide all Your Content in Images

This is another sin of the past, although you do still see it occasionally. If you embed your navigation and page copy in images, search engines will not be able to identify this content. Use text-based navigation and semantic mark-up instead.

3. Use Excessive JavaScript/AJAX

Search engines do not understand JavaScript/Ajax. If you use too much of it, particularly in your navigation, you're preventing your content from being indexed. Use text-based HTML/CSS navigation instead.

4. Copy Someone Else’s Content

You’re not much of a writer and no one will ever know right? I’m afraid not. Duplicate content is one of the biggest sins you can commit as far as search engines are concerned. Not only do you prevent your site from ranking, but you also impact the site you copied the material from.

You should either write all content yourself or hire somebody else to do it for you.

Editor’s Note: For an alternative view read Jill Whalen’s article “There Is No Duplicate Content Penalty”

5. Stuff Your Content with Keywords

Including the same keyword phrase in the Page Title, the Heading, three times in the copy, bolded, in a link and in an image alt tag is an example of keyword stuffing.

This is like telling the search engine, ‘I’m trying to manipulate you into ranking me higher’. They won’t rank you higher, they’ll penalize you instead.

Include your keyword phrase, but do it naturally. Write primarily for humans then worry about search engines.

6. Use Automated Directory Submission Software

There are loads of software programs to automate the directory submission process. Don’t use them.

First of all, directory submissions are not that valuable in terms of PageRank anymore. Secondly, you'll end up with thousands of similar Titles and Descriptions that will mark you as a cheater as far as search engines are concerned.

Submit to well-established directories, write unique Titles and Descriptions and try to focus on local and niche directories.

7. Participate in Link Exchange Programs

Often you see websites with a ‘Links’ or ‘Resources’ page with a list of links a mile long. Often these have no relevance to the content of the site and have been reciprocated on an identical page on the corresponding site.

There are a number of problems with this:

* Reciprocal links are less valuable than one way links.
* Links from pages containing hundreds of links have little value.
* It looks like you are trying to cheat the search engines.
* It looks unprofessional to users.

Reciprocal links are fine in moderation. Try to only swap links with relevant organizations, and try to link from, and get links from, inside paragraphs of related text in articles or blog posts.

8. Use the same Page Title and Meta Description Across Your Site

If you do this, you are basically saying that all your webpages are the same. Search engines index webpages and not websites; this is why you should have a unique, accurate Page Title and Meta Description for every page. Include your keywords in the Page Title in particular but don’t go crazy with the Meta Description.
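One simple way to catch this on your own site is to fetch each page and group the URLs by their <title> text; anything that shows up more than once needs a rewrite. The rough Python sketch below does exactly that (the example.com URLs are placeholders; substitute your own page list):

    from collections import defaultdict
    from html.parser import HTMLParser
    from urllib.request import urlopen

    class TitleParser(HTMLParser):
        # Collects the text inside the page's <title> element.
        def __init__(self):
            super().__init__()
            self.in_title = False
            self.title = ""

        def handle_starttag(self, tag, attrs):
            if tag == "title":
                self.in_title = True

        def handle_endtag(self, tag):
            if tag == "title":
                self.in_title = False

        def handle_data(self, data):
            if self.in_title:
                self.title += data

    pages = ["http://example.com/", "http://example.com/about/", "http://example.com/contact/"]
    titles = defaultdict(list)
    for url in pages:
        parser = TitleParser()
        parser.feed(urlopen(url).read().decode("utf-8", errors="replace"))
        titles[parser.title.strip()].append(url)

    for title, urls in titles.items():
        if len(urls) > 1:
            print(f"Duplicate title {title!r} on: {urls}")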

9. Focus on the Wrong Keywords

Often when companies hire an SEO, they'll say they want to rank number one in Google for a 'short-tail', high-competition keyword phrase. For instance, an accountant might want to rank for the term 'accountant.'

I’m afraid it doesn’t really work like that. It takes a while for any site to get off the ground and this is why it is important to focus on ‘long-tail’ keyword phrases. This is achieved by adding modifiers to the original phrase like ‘tax accountant’ or ‘tax accountant manchester.’

By focusing on less competitive phrases you can get instant traffic while setting yourself up to compete for more competitive phrases in the future.

10. Don’t Use SEF URLs

A SEF (Search Engine Friendly) URL is really a human friendly URL. What I mean by this is one that contains actual words and not a list of numbers and symbols.

SEF URL: http://pupul.org/web-design-and-development/cms/wordpress-websites/

Non SEF URL: http://pupul.org/?page_id=330

The first example contains words that relate to the content of the page as well as what category the page is in. The second example contains only the article ID, which is of little use to anyone.
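Most CMSs generate these URLs automatically by turning the page title into a 'slug'. If you ever need to do it yourself, the idea is simple; here is a rough Python sketch (not how WordPress actually implements it, just an illustration of the principle):

    import re

    def slugify(title: str) -> str:
        # Lowercase the title and collapse every run of non-alphanumeric
        # characters into a single hyphen.
        slug = title.lower()
        slug = re.sub(r"[^a-z0-9]+", "-", slug)
        return slug.strip("-")

    print(slugify("WordPress Websites"))               # wordpress-websites
    print(slugify("Web Design & Development (CMS)"))   # web-design-development-cms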

Who Keeps Spreading Silly SEO Stupidity, and Why?

Not a week goes by where a reader or a client doesn't ask me a question based on some bad SEO advice they heard or read somewhere. Most of the time they don't know it's bad advice. They assume that if they read it in a blog, went to a seminar, listened to a webinar or even discussed it with a company that provides SEO as a service, the advice must be solid. Sometimes (usually if they're a long-term HRA reader 😉) they may think it sounds a bit fishy, and smartly ask for my opinion.

While it’s true that among SEO industry veterans there can be disagreement about what works and what doesn’t, there are some SEO tactics that have been known by all who have even the slightest bit of intelligence to be useless. And yet they still crop up as SEO advice — all the time!

Just last week I got an email from a longtime HRA subscriber who told me that his friend had attended a seminar where the speaker told them they should submit their website to search engines on a monthly basis, and proceeded to provide them with the name of a tool that would do so for only $99 per month!

And just yesterday, someone emailed me for my opinion when she read in another email newsletter that Google only indexed the first 100 words on a page!

When I hear this sort of irresponsible and incorrect information being spread to impressionable Internet marketers in the making, I get irate. In fact, here's what I said in response to the question about submitting sites to the search engines:

“I honestly can’t believe that there could still be, in 2011, someone who would speak to an audience on any form of Internet marketing who would recommend submitting to search engines, let alone one that would recommend spending $99 (or even 10 cents) a month to do so. In fact, it enrages me. That person who spoke must be a sales rep for that [submission tool] company, and he or she should be thrown out of the business and not allowed to speak on the topic ever again.”

While it is likely that the speaker was a paid sponsor there to peddle his putrid website submission tool to clueless newbies, I started to wonder about others who spread this sort of silly SEO stupidity, and why.

Here’s what I came up with:

It’s Easy to Implement

This is likely the main reason that SEO stupidity spreads like wildfire, and the reason that is the basis for all the other reasons. SEO — that is, real SEO — is hard. Stupid SEO is easy. (So what if it doesn't work? That's just a small inconvenience!)

Incompetent SEOs have a vested interest in perpetuating silly SEO. The more people who think that SEO is about submitting to search engines or about meta keywords, the more people will sign up for their boondoggle services and the more ill-gotten money they'll have lining their pockets.

Old Articles Get Recirculated

There are more than 15 years' worth of old, out-of-date SEO articles from a variety of sources that may look credible on the surface (and perhaps were once), but that provide advice that has nothing to do with SEO in the 21st century. Just do a Google search for "Should I submit to search engines?" and you'll see all sorts of fun stuff. Even Google's Webmaster Guidelines point to their Add-URL page, which is all but worthless.

Forum Circle Jerks

There are, surprisingly, still a lot of SEO forums in the online world, most of them full of newbies. While it’s great that new people in our industry want to learn SEO, they need some professional and competent SEOs there to guide them. Yet on many forums it’s a case of the blind leading the blind. A newbie thinks some silly SEO technique works and spreads it to the other newbies. Eventually one of the more enterprising younger SEOs writes the “Newbie Bible to Stupid SEO” and at that point what is said must be true (cuz it’s in the bible!).

Believing What You Read or Hear Instead of Figuring It Out for Yourself

This truly irks me to no end and is definitely one of the major causes for the spread of many a silly SEO idea. If something you read sounds credible, then by all means give it a try. But unless you see proof of it working with your own eyes, don't believe it…even if the most credible person in the SEO world wrote or said it.

Mixing Up Cause And Effect

This is another one of my pet peeves, one that has been common since the beginning of SEO time. Just because you changed the positioning of a word in your title tag and the next day you ranked one place higher in Google doesn't mean that your change is what caused it. It may have, but it may not have. We used to joke on the High Rankings Forum that if you keep a cabbage on your monitor it will boost your rankings. Why not? It's as likely as some of the silly SEO theories that are based on poorly drawn conclusions that mix up cause and effect.

They’re Set In Their Ways

We all know that people hate change. Many SEOs are no different. But just because a 1990s search engine could only index a certain number of kilobytes of information on a page (likely due to bandwidth constraints) doesn’t mean that today’s Google works that way. The search engines themselves have made huge strides over the years, and while the basics of making a great site will always remain the same, the mechanics of how to do that change often. So to the person who recently asked me if hand-coded HTML pages will rank better than dynamically generated ones, the answer is a definitive NO, even if it may have been true in 1996!