Graffiti 1.1 now available

For version 1.1 we’ve added: 

  • Bulk comment management
  • Permissions functionality
  • Packages for easily sharing Graffiti plug-ins
  • Tools for easily importing from other blogging platforms, such as dasBlog
  • 30+ bug fixes
  • And more…

We also introduced a new 3-person license for $99. We heard from a number of small shops that they wanted to use the commercial edition of Graffiti, but a 10-person license just didn’t make sense for them.

Questions, comments, or suggestions about the software? Join in the discussion. We love hearing from you!

CodeSmith 5.0 Beta now available

This release adds support for .NET 3.5, includes some significant performance updates, and brings a number of improvements to CodeSmith Projects. You can view the entire list of additions and updates here.

We’ve also moved all the core CodeSmith templates into a project on Google Code: CodeSmith Samples

Below are some of the template updates that are in CodeSmith 5.0: 

  • .netTiers 2.3 Beta included
  • Updated NHibernate templates
  • PLINQO template improvements
  • NuSoft Framework templates included
  • More VB templates and samples than ever before!

That last bullet is significant for all the developers who have been asking for templates in VB. We’ve started converting a lot of the core templates to both VB & C#.

Read more and download the beta…

in.telligent 2008 – social computing conference

I’m happy to announce the 2008 in.telligent conference!

Our first conference, in 2007, was a bigger-than-expected success, with attendees coming from all around the world to talk, network, and learn.

This year we’re planning bigger. Much bigger.

The conference will be held October 20th – 22nd (more details here). The biggest change is that this year, instead of a single technical track, we’re creating 2 tracks along with a set of post-conference workshops.

The technical track will focus on Community Server, SharePoint, Graffiti, and Harvest, and is aimed at developers who want a deeper understanding of how these technologies can be used to build world-class solutions.

The non-technical track will focus on broader topics – planning social computing solutions, measuring ROI, brand building, and more – that will appeal to business owners, brand managers, and marketers.

We’ll also have some smaller hands-on post-conference workshops with personal mentoring from our team.

It will be an exciting time to network and meet other like-minded people interested in the social computing space.

While we’re not quite ready for registration, you can sign up on the site now and we’ll email you when registration opens!

Taming the email monster – Microsoft Outlook best practices

I receive 100-150 emails per day, and a lot of people have asked how I manage my inbox.

For email I use Microsoft Outlook 2007, Apple Mail.app, and Microsoft Exchange. As much as I complain about email, it is a love/hate relationship. Email is always the first application I open, followed by a web browser.

No real innovation in email in a decade

In my opinion, email presents a great opportunity to innovate, and I don’t understand why that hasn’t happened. Microsoft’s Outlook product is ripe for such innovation.

Common Questions

Do you use Clear Context?
No. While I believe its intent is to do much of what I prescribe, it’s not nearly as aggressive as I’d like – although I know plenty of people for whom it works perfectly.

What about SPAM and unsolicited email?
People are constantly worried about publishing their email address and getting SPAM. While it’s true that you may get more SPAM, it doesn’t take much work for your name to end up on a spammer’s list anyway.

I’ve published my email address publicly now (rhoward@telligent.com) for years. It’s actually something I started doing as an evangelist when I was at Microsoft. I remember lots of my friends and co-workers at Microsoft being shocked that I would publicly give out my email address. Now it’s common practice.

As for getting unsolicited emails, people are pretty respectful of what is appropriate to email.

Email Tips & Tricks

My tips & tricks that I’ve picked up for dealing with lots of email really have all come from other people – I’ve just combined a lot of them together. They are surprisingly straight-forward and simple:

When it comes to managing my inbox, I make extensive use of Microsoft Exchange Rules, Folders, and Views.

Tip #1 – Only mail to me shows up in my inbox
I communicate to everyone I work with that I treat email where I’m on the ‘to:’ line and email where I’m on the ‘cc:’ line completely differently.

If it’s important and you want me to look at it, put me on the ‘to:’ line; otherwise I don’t even promise to read it.

My mailbox has a folder called “__cc: Rob Howard”; any emails that I am cc:ed on are moved to this folder by a rule. I’ll scan through emails that I’m cc:ed on several times a day, but that’s about it.

In other words I’ll scan the subjects, read the ones that look interesting, and then delete everything or move interesting items to my “Sent Items” folder (more on that in a minute).

Only email that is sent directly to: me (and not from a distribution list or other automated source) ends up in my actual inbox. You would be amazed at how much noise this removes.
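
If you’d rather script this rule than click through Outlook’s rules wizard, here’s a rough sketch using the Exchange Web Services Managed API. The folder name is mine from above; everything else is illustrative, so check it against the EWS documentation before relying on it:

using System;
using Microsoft.Exchange.WebServices.Data;

class CcRuleSetup
{
    static void Main()
    {
        // Connect to Exchange (the inbox rules API requires Exchange 2010 SP1 or later).
        ExchangeService service = new ExchangeService(ExchangeVersion.Exchange2010_SP1);
        service.AutodiscoverUrl("rhoward@telligent.com");

        // Create the "__cc: Rob Howard" folder under the inbox.
        Folder ccFolder = new Folder(service);
        ccFolder.DisplayName = "__cc: Rob Howard";
        ccFolder.Save(WellKnownFolderName.Inbox);

        // Rule: mail where I'm on the cc: line gets moved out of the inbox.
        Rule ccRule = new Rule();
        ccRule.DisplayName = "Move cc: mail";
        ccRule.Conditions.SentCcMe = true;
        ccRule.Actions.MoveToFolder = ccFolder.Id;

        service.UpdateInboxRules(new RuleOperation[] { new CreateRuleOperation(ccRule) }, true);
    }
}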

Tip #2 – An inbox for internal and an inbox for external
I don’t use this as diligently as I used to, mainly because I’ve found that good filtering and rules are good enough for controlling what comes into the inbox. Of course, it helps to have an assistant who can look through these emails as well.

Nevertheless, since this was part of the original post I want to share it:

Email that is sent to me from people in our Exchange Global Address List (GAL) and email that is sent to me from people not in the GAL are also treated differently.

If an email is to: me but comes from someone within the Telligent GAL, it ends up in my main Inbox. If an email is to: me but comes from someone not found in the GAL, it goes to a folder called “_Inbox (Customers)”.

Again the idea here is to help prioritize what I’m looking at.

Note, the use of the underscores in front of folders is only to control the sort order.

Tip #3 – A rule for everything else
All other email – whether from an internal list such as our Telligent Product Discussion List, from external email lists such as those from Google, Twitter, and Facebook, or from just about anything else – gets an Exchange rule to send it to a folder.

This keeps the noise from all the various discussion lists that I’m part of out of my inbox.

Keep yourself sane with email rules

I also use an explicit “Delete” rule. Some mail just doesn’t get stopped by Postini or flagged as junk; it gets moved by my “Delete” rule directly to the Deleted Items folder.

Tip #4 – Conversation View
Both Mail.app and Microsoft Outlook have a view option called “Conversation View”.

To enable Conversation View in Microsoft Outlook, right-click on the header of the mail grid, select “Customize Current View”, and then select “Conversation”.

To enable it in Mail.app: select a folder, such as your inbox, and click View and select “Organize by Thread”.

Conversation view groups messages into threads, so if a reply comes in for a thread that is already in your inbox, you see the whole thread together. Below is a screenshot from Mail.app that shows a threaded view of an internal discussion about Developer Reviews:

Rather than reading every message separately you can read all the messages together, delete older ones and only keep the most recent (something I do often), or just delete the whole thing.

Tip #5 – Read, Unread, Delete or Archive
Outlook has flags for setting different status information about emails; it also supports follow-up notifications. I don’t use any of these. Instead, all email exists in one of 4 states:

  • Read – Email that is marked as ‘read’ in my inbox is considered ‘completed’. It will either be moved to an archive or deleted. If I read an email and need to go back and take action on it, I leave its state as ‘unread’.
  • Unread – Email that requires action. Using this, I can glance at my inbox’s unread count at any moment and know whether I have new actionable emails. This is especially useful once you apply the other filtering rules, because the unread count then only applies to the inbox.
  • Delete – I keep mail in my “Deleted” folder for about 3-4 weeks; anything older than that I delete every Monday.
  • Archive – see tip #6.

Tip #6 – Archive and Search
Anything that I send or anything that I want to keep I move to my “Sent Items” folder. Every couple of months I move everything from Sent Items to a backup PST file organized by year (each year gets a new PST file). I’ll even occasionally move other mail that I have read and want to save there too.

SEO & ASP.NET: Put keywords in the URL

See the first post in this series | Previous Post

Tip #3 – Put keywords in the URL, the sooner the better.

There are 3 documented areas where Google looks for keywords: the URL, the title, and finally the body of your content. So if you are not embedding keywords into your URL, you are missing an opportunity to increase the odds of your content ranking higher in natural search results.

How does Google find keywords in the URL?

Take the following 4 URLs (I’m assuming you also read the previous article about how you construct your links):

  • example.com/seoforaspnetdevelopers/
  • example.com/SeoForAspnetDevelopers/
  • example.com/seo_for_aspnet_developers/
  • example.com/seo-for-aspnet-developers/

There are 4 keywords in total: ‘seo’, ‘for’, ‘aspnet’, and ‘developers’.

Which format is better? Or does it really even matter? Hopefully it comes as no surprise that the first 2 examples are virtually identical: from what my research has shown, Google does not use casing information to pull out keywords in either case. So the first 2 examples are very bad choices for how to format your URL if you care about SEO. In the 3rd example underscores are used to break up the keywords, and in the 4th example dashes are used.

Google has stated that the preferred way to break up keywords in the URL is to use dashes, and most modern content management systems and blog engines follow this convention.
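
If you’re building your own engine, generating dash-separated slugs only takes a few lines. Here’s a minimal C# sketch – my own helper, not the code Graffiti or Community Server actually uses:

using System;
using System.Text.RegularExpressions;

static class Slug
{
    // Turns "SEO for ASP.NET Developers" into "seo-for-aspnet-developers".
    public static string Create(string title)
    {
        string s = title.ToLowerInvariant();
        s = s.Replace(".", "");                    // "asp.net" -> "aspnet"
        s = Regex.Replace(s, "[^a-z0-9]+", "-");   // collapse everything else to dashes
        return s.Trim('-');
    }
}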

The ordering of keywords matters

Furthermore, my research has shown that the order of your keywords matters, and that the domain name is considered for keywords too. In Community Server and Graffiti we automatically build the title and URL of posts based on the subject of the post entry. Community Server goes one step further and allows you to control the URL independently of the subject – functionality we’ll add to Graffiti soon as well, because controlling the ordering of keywords in the URL matters too.

URL Rewriting

If you care about how your links are built, want to ensure there is only one way to get to your content, and care about the ordering of keywords, then you likely care a great deal about URL rewriting: the ability to publish a URL that may not be the same URL the application requires internally.

URL rewriting allows you to take a URL like:

  • example.com/posts/default.aspx?postid=34

and publish it as:

  • example.com/seo/aspnet-seo-optimization-with-url-rewriting/

There are several different techniques for URL rewriting for ASP.NET and this blog post is certainly not going to attempt to address them all.

Simple URL Rewriting

The first technique for URL rewriting is very simple: it tries to take advantage of (game) the path parsing of a crawler. This technique sends all requests through a single controller, and it works best when you are in a hosted environment and do not have the ability to run an ISAPI filter to rewrite URLs (or access to IIS 7):

example.com/33.axd/seo/aspnet-seo-optimization-with-url-rewriting/

— or —

example.com/aspnet-seo-optimization-with-url-rewriting.aspx

The latter technique is how Community Server, as well as other .NET blogging engines, constructs URLs. The first technique is something new we’ve been experimenting with; we tried it first with the Telligent Wiki Prototype that runs docs.communityserver.com and wiki.asp.net (and a few other sites).

In the first example there is an HttpHandler that looks for all requests that use the .axd extension (note: any extension type will work). The handler parses the path of the request but only cares about the identifier – in this example, the key is 33 – which allows it to pull the content from the database.

The second case is again a virtual handler, and it loads up the content based on the name of the post.
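
To make the first technique concrete, here is a minimal sketch of such a handler. The names and the data access are hypothetical stand-ins, not the actual Telligent code:

using System;
using System.IO;
using System.Web;

public class ContentHandler : IHttpHandler
{
    public bool IsReusable
    {
        get { return true; }
    }

    public void ProcessRequest(HttpContext context)
    {
        // For /33.axd/seo/aspnet-seo-optimization-with-url-rewriting/
        // the executing file is "/33.axd"; everything after it is path info.
        string file = context.Request.CurrentExecutionFilePath;
        string key = Path.GetFileNameWithoutExtension(file);   // "33"

        int id;
        if (!int.TryParse(key, out id))
        {
            context.Response.StatusCode = 404;
            return;
        }

        context.Response.Write(PostRepository.LoadBody(id));
    }
}

static class PostRepository
{
    // Stand-in for the real database lookup.
    public static string LoadBody(int id)
    {
        return "<html><body>Post " + id + "</body></html>";
    }
}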

The difference or benefit between the two techniques is unclear. However, I suspect that the first example, where slashes are used for paths, will likely work better for SEO purposes. But that is conjecture.

Advanced URL Rewriting

If you have control over the server or have a more progressive host there are some other options to consider for more advanced URL rewriting.

In Graffiti we actually create files and directories to give users full control over the path, vs. the virtual URLs used in Community Server. This has both benefits and pitfalls. The benefit is that you get very clean paths with no extensions in them. The pitfall is that it requires permissions to write to the disk.

For example, a post titled “ASP.NET SEO Optimization with URL Rewriting” in a category called “SEO” would create:

  • [path to Graffiti application]\seo\aspnet-seo-optimization-with-url-rewriting\default.aspx

The URL would then be published as:

  • example.com/seo/aspnet-seo-optimization-with-url-rewriting/

This obviously works very, very well. The default.aspx page can internally store all the details, such as the post id, needed to quickly look up the post in the database.

Another option is to use a URL rewriting library like ISAPI Rewrite, which works very similarly to Apache’s mod_rewrite. This is an ISAPI filter – $99 well spent – that allows you to fully control all URLs for your application.

The future: IIS7

Unlike previous versions of Microsoft’s Internet Information Server, IIS7 allows ASP.NET HttpModules to perform exactly the same tasks as ISAPI filters. This means you could write an HttpModule that handles all of your application’s URLs (similar to ISAPI Rewrite), all with .NET code!

Furthermore, this also means you can do things like use ASP.NET cookie authentication to control access to any resource (images, HTML pages, etc.) – something that isn’t easily accomplished today. Taken one step further, it also means you could have .NET code authenticate requests served by a PHP application running in IIS!
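
As a sketch of what that could look like, here is a small rewriting HttpModule. The names are mine, and the hard-coded slug check stands in for a real database lookup:

using System;
using System.Web;

public class RewriteModule : IHttpModule
{
    public void Init(HttpApplication app)
    {
        app.BeginRequest += delegate(object sender, EventArgs e)
        {
            HttpApplication application = (HttpApplication)sender;
            string path = application.Context.Request.Path;

            // Stand-in for looking the slug up in the database.
            if (path == "/seo/aspnet-seo-optimization-with-url-rewriting/")
            {
                // Serve the internal URL; the public URL stays in the address bar.
                application.Context.RewritePath("/posts/default.aspx?postid=34");
            }
        };
    }

    public void Dispose() { }
}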

If you want to read more about URL rewriting for ASP.NET, check out Scott Guthrie’s article.

Next Tip: Titles & MetaTags (not yet written)

SEO & ASP.NET: How content is linked really does matter

See the first post in this series

Tip #2 – How people link to you really does matter

Have you ever taken the time to look at all the different ways you link to your content? For ASP.NET developers there are typically 3 ways to link to the default page (usually default.aspx). For example, let’s say there is a landing page in your site for “Products” under the directory /products with a default.aspx page. You could therefore link to it as:

  • example.com/products
  • example.com/products/
  • example.com/products/default.aspx

Which is the right one to use? That largely depends on what you are doing. My personal preference is to load up the URL with keywords (more on that in a future post) and to not show the page extension.

In my opinion the right choice is:

  • example.com/products/

Why does this matter?

In Google’s eyes, example.com/products and example.com/products/ (note the trailing slash) are 2 different pages! In fact, example.com/products/default.aspx is considered a different page as well!

So let’s say you’ve written a brilliant piece of content and people are linking to it like mad (good for SEO). You’re on Digg, cnn.com, and even Mr. Guthrie covered you! Let’s say there are 1,000 total links to your post – you’re golden, right? Well, maybe not.

If 25% of the people linked to example.com/products, 50% linked to example.com/products/, and the remaining 25% linked to example.com/products/default.aspx, you would not be maximizing your incoming links. Instead, you would be splitting them across 3 different pages with the exact same content. The number of external incoming links is very important for SEO, and if you don’t control how people link to your content, you are reducing your ability to rank higher in natural search results.

This can get even worse if you are serving content for 2 domains as discussed here.

How to solve this?

The good news is that solving this is fairly straightforward using HTTP 301 status codes.

An HTTP 301 status code tells the browser to redirect to another URL, and that the redirect is permanent. Another status code, 302 – the one ASP.NET developers use all the time via Response.Redirect, which we’ll cover in a future article – also redirects to another URL, but says that the redirect is temporary.

The difference between a 301 and a 302 is not noticeable to someone browsing a site, but for search engines its meaning is very important: a 301 tells the crawler to permanently credit and index the target URL, while a 302 tells it that the original URL is still the one that counts.

The recommendation for ASP.NET developers is to use something like an HttpModule to inspect every incoming request and examine the path being requested. If a request is for /default.aspx, or for a page that doesn’t have a file extension, the HttpModule can short-circuit the request and send an HTTP 301 back to the browser redirecting to the right URL:

Response.StatusCode = 301;
Response.RedirectLocation = "/products/";
Response.End();

Yes, this does mean that for every incorrect URL requested the browser has to fetch a new URL (2 requests to the server). But the good news is that when search crawlers find the links – no matter the format – the 301 redirect hands them the right link, ensuring that you aren’t serving different pages for the same content.
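
A minimal sketch of such a module might look like the following. The matching rules here are simplified for illustration – a real module would also need to skip images, handlers, and other special paths:

using System;
using System.Web;

public class CanonicalUrlModule : IHttpModule
{
    public void Init(HttpApplication app)
    {
        app.BeginRequest += delegate(object sender, EventArgs e)
        {
            HttpContext context = ((HttpApplication)sender).Context;
            string path = context.Request.Path;
            string canonical = null;

            // /products/default.aspx -> /products/
            if (path.EndsWith("/default.aspx", StringComparison.OrdinalIgnoreCase))
                canonical = path.Substring(0, path.Length - "default.aspx".Length);
            // /products -> /products/ (extensionless paths only)
            else if (!path.EndsWith("/") && path.IndexOf('.') < 0)
                canonical = path + "/";

            if (canonical != null)
            {
                context.Response.StatusCode = 301;           // permanent redirect
                context.Response.RedirectLocation = canonical;
                context.Response.End();
            }
        };
    }

    public void Dispose() { }
}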

What about preventing search engines from accessing certain parts of your site? You can use a robots.txt file, as well as a META tag with a noindex option on individual pages. Just be careful about what you include in robots.txt – your robots.txt file is one of the first places a hacker will look to see what you are trying to hide.
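
For reference, robots.txt is just a plain text file at the root of your site. To block all crawlers from an admin area, for example:

User-agent: *
Disallow: /admin/

And the per-page alternative is a META tag in the page’s head:

<meta name="robots" content="noindex" />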

Next Tip: Put keywords in the URL

Search Engine Optimization (SEO) for ASP.NET Developers

I recently put together a presentation for a developer conference about SEO for ASP.NET developers. I was a little surprised at how little content there was when I researched this topic: there was a lot of great content about SEO in general, but only a handful of articles for developers. I’ve decided to take my talk and convert it into a series of blog posts about SEO for ASP.NET.

As it turns out, only about 50% of it actually requires a technical understanding of ASP.NET – a lot of SEO goodness can be achieved without knowing a thing about the technology your site is written on.

A Primer

Search Engine Optimization is the process of increasing your natural search rank on sites like Google. A lot of people and companies spend thousands of dollars on AdWords (and other ad options) to ensure that their content is shown when people use specific keywords looking for products or services.

Search Engine Optimization is difficult for developers for a number of reasons, probably the biggest being that the technology wasn’t specifically designed to address the challenges of SEO. For example, ASP.NET 1.0 was built and released before SEO became something people concerned themselves with.

The first recommendation we give customers is to plan early for SEO, much like we advise planning for performance and scale. Similarly, planning for your site’s SEO requires some up-front decisions about how your site’s information architecture is going to be laid out. One of the suggestions I read said that while most people plan for FireFox and Internet Explorer, they tend to forget about the 3rd major browser: search engines. This makes a lot more sense once you realize that all the compelling functionality you can enable with JavaScript libraries, Flash, and now Silverlight may be moot if search engines can’t access the content.

The good news is that there are plenty of great recommendations and tools to help you with this, as well as some strategies for dealing with sites that can be difficult to index. A little planning can go a long way.

Tools of the Trade

While there are several tools I’d recommend you add to your developer’s tool belt, there are 3 that I’ve found particularly helpful for SEO purposes:

FireBug
FireBug is an add-on for FireFox and is incredibly handy for a number of tasks, one of which is getting a quick sense of what your pages render to the browser and how much content you are sending back.

In the screenshot to the left I’ve opened the Graffiticms.com site and am examining the size of various elements that are downloaded when the page is requested.

Google Webmaster Tools
Google provides a number of great tools to help you with SEO, one of which is a suite called Google Webmaster Tools.

Once you set up your site so that Google Webmaster Tools can inspect it, you will get back a lot of data about crawl analysis, various site statistics, and ranking information, as well as some suggestions for improving your site’s indexability.

Fiddler
Fiddler is a wonderful tool that allows you to monitor the traffic that goes out with your web requests and what the server actually sends back.

When running, Fiddler sets itself up as a proxy in Internet Explorer and traps all incoming and outgoing HTTP requests. This provides insight into the various HTTP headers being sent back and forth, as well as the ability to view the raw HTML that the server is generating.

Tip #1 – Beware of Duplicate Content

If you think that creating multiple pages on your site with the same content – or publishing the exact same content multiple times through other sites – is good, you would be mistaken. While search engines such as Google don’t necessarily consider this gaming (although it has been known to be done for that purpose), duplicate content ensures that there are multiple places where the content can be found, which takes away the uniqueness of the content and decreases your natural search rank.

How many domains are you supporting?

An example of this that most people probably aren’t aware of is their own domain: www.example.com and example.com. While most of us would likely view these 2 domains as ‘identical’, search engines view them as 2 separate sites. So if you create content and search engines can access it either with or without the ‘www’, you are decreasing your uniqueness and essentially duplicating content.

In Community Server, we actually built in functionality starting in version 2.0 that automatically forces a Community Server site to run on a single domain. By default we always try to strip ‘www’ off the front of the URL – trying to help customers get better SEO for their sites. As it turns out, this is also one of the most common questions in our support forums: “Why is Community Server removing www…” For the record, this is completely configurable, and you can set Community Server to leave the www in place on the URL.

The good news is that it’s easy to resolve the issue of multiple domains serving the same content: Internet Information Server has a built-in ability to handle this. For example, if your primary site is “Example.com” and you want to make sure that all requests to www.example.com are redirected to example.com (stripping the www):

Create a Permanent Redirect in IIS

1. Open IIS and create a new website, e.g. “Example.com www redirect”, that serves requests for the same IP address as your primary site, but only for the host header www.example.com.

2. Next, configure the website to perform a permanent redirect.

Now, when requests are made to www.example.com, the server will permanently redirect them to the domain with the ‘www’ removed.
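
If you can’t create the extra site in IIS – on a shared host, for example – the same permanent redirect can be done in code. Here’s a sketch for Global.asax, assuming example.com is your canonical domain:

protected void Application_BeginRequest(object sender, EventArgs e)
{
    HttpContext context = HttpContext.Current;
    string host = context.Request.Url.Host;

    if (host.StartsWith("www.", StringComparison.OrdinalIgnoreCase))
    {
        UriBuilder canonical = new UriBuilder(context.Request.Url);
        canonical.Host = host.Substring(4);  // strip the leading "www."
        context.Response.StatusCode = 301;   // permanent
        context.Response.RedirectLocation = canonical.Uri.ToString();
        context.Response.End();
    }
}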

What about RSS?

Here is something you may not hear that often: beware of RSS syndication. RSS is a wonderful technology for enabling information sharing. People typically share content through RSS either by publishing all of their content or by publishing excerpts.

With an excerpt you are only sharing a portion or summary of the main content, typically requiring people to click through to get the full content. Sites like cnn.com typically syndicate excerpts of their main stories.

Excerpts work, but readers – at least more savvy web users – don’t like them as much. It usually means they have to leave their RSS reader to read the content on the website.

On the other hand, most bloggers, and even some other news sites, publish all their content and don’t use excerpts. For example, both weblogs.asp.net and blogs.msdn.com are configured to publish all content in the RSS feed. The problem is that SPLOGs take advantage of this.

A SPLOG is nothing more than an automated blog that publishes content by subscribing to another site’s RSS feed. The owner of the SPLOG then sets up Google AdSense, or another monetization option, and uses the content created by the primary site to drive up their own natural search rank. The goal is that people find the SPLOG and click on ads.

We see this all the time with content created on weblogs.asp.net. While I’m not advocating excerpts, you do need to realize that by publishing a full RSS feed you may be publishing your content (such as this very blog post) in more places than you realize!

Next Tip: How content is linked really does matter

Enough with the Segways already

Sure, I want a Segway; it’s a novelty and would be something fun for our office. But that’s it: fun.

Traveling around the country, I’m starting to see airport security, such as at DFW, use Segways as a way to move around quickly. They used to ride bikes, but for some reason a motorized transportation system that is 5-7x as expensive as a bike is now better. I’m also seeing them at hotels: at 2 conferences I’ve attended in the last several weeks (at the Westin in Palm Springs and a Marriott in Orlando), security were riding (or is it driving?) Segways. This morning I had to move out of the way as someone who worked for the hotel rode past me in the hall.

Something I’ve always enjoyed about visiting Europe and other dense metropolitan cities like New York is that you actually can walk everywhere. I spend a lot of time sitting: sitting behind a desk, sitting in a car, sitting on an airplane, etc.

But I guess walking is passé. So please say hi when you pass me on your Personal Transporter (and try not to run me over next time). I’ll be the weird guy who still walks places.

Community Server 2008 Now Available!

Early this morning (at about 3:45 AM Central Time) we launched the new communityserver.com web site along with officially releasing Community Server 2008!

The Community Server 2008 release includes:

  • Enterprise reporting and analytics to help dissect, analyze, and trend user and community behavior.
  • Deep integration with Microsoft Exchange, Active Directory, and Microsoft Office SharePoint Server.
  • For developers, Community Server 2008 includes a complete Web Services (REST) API for easily integrating and extending the platform.
  • Social streams enable people to quickly see what friends or others in the community are actively contributing to.
  • Robust media gallery for sharing content published in the community or from external sources such as YouTube, Flickr, and more.
  • Enterprise file storage enabling both local storage as well as integration with services such as Amazon S3.
  • Widgets for easily sharing data between applications, with included widget support for Google, and more.
  • On demand groups / social circles make it simple for intranets and even large-scale public communities to quickly form small micro-communities.
  • Community Server 2008 is one of the first software platforms to include built-in support for OpenID.
  • Community Server 2008 includes many improvements to existing features such as multi-user blogging tools, robust message boards, and person-to-person messaging.

You can read more about the new version of Community Server at communityserver.com.