Category Archives: actionable

Can Pinterest Work For You?

Pinterest.  It’s the latest thing in social networking.  Going to the Pinterest site fills you to the bursting point with eye candy.  You see recipes, beautiful vacation escapes, cute shoes, and crafts you want to try.  Ok, maybe that’s just me.   But the site is filled with wonderful pictures of interesting things that you could spend hours perusing.

But can Pinterest work for your business?  We are at the dawn of the Pinterest era, and people and small businesses are just beginning to harness its power.  Here are some things to think about when trying out Pinterest for your business.

Pinterest links are currently no-follow.  That means a link from Pinterest won’t help your site move up the search rankings.  However, getting any traffic to your site is a good thing, right?  Make sure that you have a great landing page, one that will engage potential customers and encourage them to look around your site.

Pinterest is filled with pictures.  Do you have a bunch of pictures on your site?  Are the pictures on your site ones that will get people to re-pin them and thereby advertise your business?  The Digital Mountain Consulting site has a few pictures, but they are not necessarily engaging enough to be repinned.  Maybe Pinterest is not right for DMC, or maybe we need to work on this…  But if you have a site that sells handmade purses or hard-to-find musical instruments, Pinterest might be the perfect way to promote them.  You can pin the pictures from your online or Etsy store and drive business to your products.  Or maybe you have a blog that goes along with the products you sell.  You can pin the pictures from the blog, which would naturally drive people to your website without the big sell.

Actionable Analytics — Part 3

I did it.  I packaged into a video not only the steps from Actionable Analytics — Part 2 but also some insights to help you interpret your Google Analytics (GA) traffic with your crisp new Custom Segment turned on.

You’ll see how focused your GA review time can be with the right goals set up as custom segments.  You’ll know which geographies, referrers, search keywords, content, etc. are producing results on your website.  Without delay:

Actionable Analytics – Part 2

It’s time to make your website analytics actionable!  From my last post, Actionable Analytics – Part 1, you should have:

  • Google Analytics installed on your website
  • Created a goal for Engagement
  • Created a goal for Conversion

If you don’t know what I mean, please visit Actionable Analytics – Part 1 to catch up.

The key to knowing what components of your website are producing for you is to look at visits on your website that reach the goals you’ve set.  This is done by creating an Advanced Segment in Google Analytics (GA for brevity) which, when selected, will filter out all the traffic that does NOT reach your goals.
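Conceptually, an Advanced Segment is just a filter over your visit data.  Here’s a minimal Python sketch of the idea (the visit records and field names are made up for illustration; GA applies this filter for you on its side):

```python
# Hypothetical visit records, each tagged with how many times a goal was reached
visits = [
    {'source': 'google',   'goal_completions': 1},
    {'source': 'bing',     'goal_completions': 0},
    {'source': '(direct)', 'goal_completions': 2},
]

# Keep only the visits that reached a goal at least once
converting = [v for v in visits if v['goal_completions'] > 0]

for v in converting:
    print(v['source'])
```

That filtered list is what you’ll be looking at once the segment is selected: only the traffic that produced results.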

Step 1: Log in to your GA account, click “Advanced Segments”, then “New Custom Segment”.

Clicking “New Custom Segment” opens the segment editor.

Here you will select a previously created goal in the drop-down selector right after “Include” and set the threshold.  I like “Greater than 0”.  Setting your advanced segment to include visits where a goal count is greater than 0 shows you only the visits that reached the goal.  Once you “Save Segment” you’ll be able to select this segment of your traffic when viewing your analytics.  Then everything you see, so long as the segment is selected, will be activity from visits that reached your goals!

This is BIG.  You can now locate the Top Geographies, Top Content, Top Referrers, Top Keywords, and any other Top “Dimensions” or “Metrics” you care to consider!  Now go look over your data with a new focus: Find out what’s working and DO MORE OF IT!

To make this process easier to follow, this post needs either 1) a lot more pictures or 2) a video.  I’ll get right on that.

Look for Actionable Analytics – Part 3 to contain a video overview of this process.

Actionable Analytics – Part 1

Google Analytics (GA) is one of the most powerful tools you’ll find to glean insights into the operation and marketing of a website, mobile app, or web app.  However, the big insights are not easy to find.  It takes knowledge of how to use GA to extract the gems and really build something great.  No surprise there.

In this series of articles I’m going to share with you our approach to clearing away the less fruitful pursuits in your web analytics and focus you on the ones which will yield the biggest insights with the minimum effort.

Install Google Analytics

Follow the steps to install Google Analytics on your website at http://www.google.com/analytics/ (Little plug: we do that at no additional cost when you buy either our Search Presence or Search Light services.)

Define goals

The starting point for all good decision making is deciding what counts as a good thing.  It is no different with your website and advertising.  We always create two “Goals” in Google Analytics.  The first represents a good thing that happens fairly often; we refer to this as “Engagement”.  Then we create the goal that represents the closest thing your website has to a sale and call this “Conversion.”  We’d always dig in to find the best choice for each of these goal definitions, but for the sake of example let’s say your website doesn’t have any ecommerce and you’re very happy to get sales leads through your contact form.  In this case we’d set up these two goals in GA:

  • Goal 1
    • Give it a name like “Contact Engagement”
    • Type: URL destination
    • Goal URL: /contact (or whatever the path to your contact page is)
    • Match type: regular expression (this allows the URL to match if you have /products/contact, /services/contact, /corporate/contact)
    • Give it a value if you’d like to keep track of the value of these interactions
  • Goal 2
    • Call it “Contact Conversion”
    • Type: URL destination
    • Goal URL: /contact/thankyou (use your actual page address)
    • Match type: regular expression
    • Set the value.  This should be significantly higher than the Engagement value, since Engagement is only a precursor to the truly valuable interaction: completing the sale
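
To see why the regular-expression match type matters, here’s a quick Python sketch.  The page paths are made up for illustration, and GA’s regex goal matching behaves roughly like Python’s re.search here (the pattern can match anywhere in the path):

```python
import re

# The "Contact Engagement" goal pattern from the example above
engagement = re.compile(r'/contact')

# Hypothetical page paths on the site
paths = ['/contact', '/products/contact', '/services/contact', '/about']

matched = [p for p in paths if engagement.search(p)]
print(matched)  # ['/contact', '/products/contact', '/services/contact']
```

Worth noting: a conversion path like /contact/thankyou also matches the /contact pattern, so visits that convert will typically count toward both goals.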

So now you’re off to a good start.  Google will keep track of the goals on your website going forward.  You’ll now be able to see the engagement and conversion taking place on your site.

Truly, this is only the beginning though.  When your reports show you referrers that generate engagement you’ve really got something.  When you know which geographic regions generate engagement you’ve got a little more.  Identify the content on your site that is engaging and BAM!  you’re getting closer.  Keywords that generate engagement are the icing on the cake.  Of course, you need a custom report for all that.  I’ll save that for Part 2.

We wrap all this up in our Search Light service where you don’t have to look at any code or carry out these steps yourself.  Better yet, the report comes to your email as a spreadsheet with exactly what you need to know!

Reporting on Google Analytics and Adwords in Python, Part 2

My previous post gave you some working code to get you started extracting Google Analytics data from the API.  While visits and pageviews are important metrics, they don’t mean much without context.  In this post I’ll extend the code from the previous post to show the visits and pageviews for each referrer to a site.  That’ll put some meat on these bones!

First, I’ll focus on just the creation of the query URI.  In Part 1, our query looked like:

query_uri = gdata.analytics.client.DataFeedQuery({
    'ids': PROFILE_ID,
    'start-date': sd,
    'end-date': ed,
    'dimensions': 'ga:date',
    'metrics': 'ga:visits',
})

and had the output:

ga:date ga:visits
20111121        3214
20111122        2692
20111123        2360
20111124        1537
20111125        2227
20111126        2171
20111127        2220

A query asks the Google Analytics API for all metrics for each dimension specified. In this case the query asks for all “ga:visits” for each “ga:date” dimension which are recorded over the date range from “start-date” to “end-date” for the “ids” specified.

Here’s a new query to show visits and pageviews for all referrers each day inside the date range:

query_uri = gdata.analytics.client.DataFeedQuery({
    'ids': PROFILE_ID,
    'start-date': sd,
    'end-date': ed,
    'dimensions': 'ga:date,ga:source',
    'metrics': 'ga:visits,ga:pageviews',
})

And its output looks like:

ga:date ga:source       ga:visits       ga:pageviews
20111121        (direct)        1029    7806
20111121        ask     8       44
20111121        austin360.com   1       12
20111121        bing    133     1130
20111121        en.wikipedia.org        2       40
20111121        facebook.com    2       9
20111121        google  1550    10642
20111121        google.com      6       27
20111121        home.myhughesnet.com    1       2
20111121        yellowbook.com  3       15
20111121        yellowpages.com 1       5
20111121        yelp.com        1       9
... many entries eliminated to protect the innocent
20111122        (direct)        844     4728
20111122        aol     5       11
20111122        ask     4       32
20111122        yahoo   124     1005
20111122        yellowbook.com  2       10
20111122        yellowpages.com 3       31
... many entries eliminated to protect the innocent
20111123        (direct)        793     4256
20111123        aol     9       62
20111123        ask     3       75
20111123        bing    118     867
20111123        yelp.com        1       7
... many entries eliminated to protect the innocent
20111124        (direct)        475     3526
20111124        aol     4       31
20111124        ask     5       37
20111124        bing    60      389
... many entries eliminated to protect the innocent
20111125        (direct)        676     4041
20111125        aol     6       42
20111125        ask     7       44
20111125        bing    99      588
20111125        yahoo   130     1152
20111125        yellowbook.com  3       14
... many entries eliminated to protect the innocent
20111126        (direct)        664     3524
20111126        aol     5       35
20111126        ask     9       34
20111126        bing    76      599
20111126        sxsw.com        7       17
20111126        yahoo   138     1002
20111126        yellowbook.com  5       18
20111126        yellowpages.com 1       19
20111126        yelp.com        2       48
... many entries eliminated to protect the innocent
20111127        (direct)        636     3507
20111127        aol     7       127
20111127        ask     7       67
20111127        bing    84      548
20111127        en.wikipedia.org        1       7
20111127        facebook.com    2       12
20111127        google.com      10      72
20111127        m.yp.com        2       5
20111127        mail.aol.com    1       4
20111127        yahoo   127     1074
20111127        yellowbook.com  3       10
20111127        yellowpages.com 1       1
20111127        yelp.com        2       38


Drop the “ga:date” dimension and you get all referrers in your date range with no grouping by date.  The query:

query_uri = gdata.analytics.client.DataFeedQuery({
    'ids': PROFILE_ID,
    'start-date': sd,
    'end-date': ed,
    'dimensions': 'ga:source',
    'metrics': 'ga:visits,ga:pageviews',
})

Its output:

ga:source       ga:visits       ga:pageviews
(direct)        5117    31388
aol     50      386
ask     43      333
bing    677     4705
bookmarks.yahoo.com     2       6
business.com    1       7
google  7937    53270
mws.ask.com     1       10
my.msn.com      1       41
my.yahoo.com    1       3
sxsw.com        33      89
yahoo   875     6789
yellowbook.com  25      104
yellowpages.com 7       68
yelp.com        6       102
... many entries eliminated to protect the innocent


Both of those outputs are valuable.  The first, referrers and their referral volume by date, tells you when the referrals were delivered.  I’d use the second version to get an overview of aggregate referral volume, without the date segmentation, to get a feel for a site’s referral health.
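As a quick example of what you can derive from the aggregate output, here’s a sketch that computes pages per visit for a few referrers (the numbers are copied from the table above; the referrer selection is arbitrary):

```python
# Visits and pageviews per referrer, copied from the aggregate output above
referrers = {
    '(direct)': (5117, 31388),
    'bing':     (677, 4705),
    'google':   (7937, 53270),
    'yahoo':    (875, 6789),
}

# Pages per visit is a rough proxy for how engaged each source's traffic is
pages_per_visit = dict(
    (source, round(float(pageviews) / visits, 2))
    for source, (visits, pageviews) in referrers.items()
)

for source in sorted(pages_per_visit):
    print(source, pages_per_visit[source])
```

On these numbers every source delivers six to eight pages per visit, which is the kind of at-a-glance health check the aggregate query makes easy.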

Here’s the full code for the Python program that produced the second output:

#!/usr/bin/python

'''

Use the Google Data python module to query Google Analytics

You'll get the "PROFILE_ID" from your Google Analytics account.  From the default
listing you click on the account name and then take the "Edit" action on the website
profile you want to find the profile id for.  It is listed near the top right under
"Profile Settings"
'''

__author__ = 'Leo Edmiston-Cyr '

import gdata.analytics.client
import datetime

USERNAME = 'yourAnalyticsAccount@gmail.com'
PASSWORD = 'yourToughPassw0rd'
PROFILE_ID = 'ga:1234567' # the GA profile ID to query
SOURCE_APP_NAME = 'GAGettah' # anything you want to call it
sd = datetime.date(2011,11,21)
ed = datetime.date(2011,11,27)
#ed = datetime.date.today()

def main ():

    my_client = gdata.analytics.client.AnalyticsClient(source=SOURCE_APP_NAME)
    my_client.client_login(
        USERNAME,
        PASSWORD,
        SOURCE_APP_NAME,
        service='analytics')

    query_uri = gdata.analytics.client.DataFeedQuery({
        'ids': PROFILE_ID,
        'start-date': sd,
        'end-date': ed,
        #'dimensions': 'ga:date,ga:source', # enable grouping by date
        'dimensions': 'ga:source',
        'metrics': 'ga:visits,ga:pageviews',
    })
    feed = my_client.GetDataFeed(query_uri)

    # find out if this is the first run through the results
    # to build a simple header for the dimensions and metrics
    firstRun = True
    heading = []

    # we'll run through the data feed returned from our query
    for entry in feed.entry:

      # build each row of data from the feed
      row = []

      # pull all dimensions out of this entry
      for dim in entry.dimension:
        if firstRun:
          heading.append(dim.name)
        row.append(dim.value)

      # pull all metrics out of this entry
      for met in entry.metric:
        if firstRun:
          heading.append(met.name)
        row.append(met.value)

      # print the dimension and metric names as the header
      if firstRun:
        print "\t".join(heading) + "\r"

      # print all rows from the feed as they are built
      print "\t".join(row) + "\r"

      # don't print the dimension and metric names as the header again
      firstRun = False


if __name__ == '__main__':
    main()

Want more dimensions and metrics?  Visit the GA API dimensions and metrics reference page: http://code.google.com/apis/analytics/docs/gdata/dimsmets/dimsmets.html

These articles were aimed at the technical web marketer who either writes a little code or has a programmer they want to egg on.

Reporting on Google Analytics and Adwords in Python, Part 1

Caution: This post is heavy on the geek.  You’ve been warned!

Here is the simplest example worth trying to get you started:

#!/usr/bin/python

'''

Use the Google Data python module to query Google Analytics

You'll get the "PROFILE_ID" from your Google Analytics account.
From the default listing you click on the account name and then
take the "Edit" action on the website profile you want to find
the profile id for.  It is listed near the top right under
"Profile Settings"
'''

__author__ = 'Leo Edmiston-Cyr '

import gdata.analytics.client
import datetime

USERNAME = 'yourAnalyticsAccount@gmail.com'
PASSWORD = 'yourToughPassw0rd'
PROFILE_ID = 'ga:1234567' # the GA profile ID to query
SOURCE_APP_NAME = 'GAGettah'
sd = datetime.date(2011,11,21)
ed = datetime.date(2011,11,27)

def main ():

    my_client = gdata.analytics.client.AnalyticsClient(source=SOURCE_APP_NAME)
    my_client.client_login(
        USERNAME,
        PASSWORD,
        SOURCE_APP_NAME,
        service='analytics')

    query_uri = gdata.analytics.client.DataFeedQuery({
        'ids': PROFILE_ID,
        'start-date': sd,
        'end-date': ed,
        'dimensions': 'ga:date',
        'metrics': 'ga:visits',
    })
    feed = my_client.GetDataFeed(query_uri)

    # find out if this is the first run through the results
    # to build a simple header for the dimensions and metrics
    firstRun = True
    heading = []

    # we'll run through the data feed returned from our query
    for entry in feed.entry:

      # build each row of data from the feed
      row = []

      # pull all dimensions out of this entry
      for dim in entry.dimension:
        if firstRun:
          heading.append(dim.name)
        row.append(dim.value)

      # pull all metrics out of this entry
      for met in entry.metric:
        if firstRun:
          heading.append(met.name)
        row.append(met.value)

      # print the dimension and metric names as the header
      if firstRun:
        print "\t".join(heading) + "\r"

      # print all rows from the feed as they are built
      print "\t".join(row) + "\r"

      # don't print the dimension and metric names as the header again
      firstRun = False

if __name__ == '__main__':
    main()

Here is its output:

ga:date ga:visits
20111121        3214
20111122        2692
20111123        2360
20111124        1537
20111125        2227
20111126        2171
20111127        2220


In the next post I’ll add some dimensions and metrics.

Why oh why EC2 do you make me blue?

Amazon EC2.  I love it!  I hate it!  It frustrates and frees me.  It is many things.  However, EC2 fits into my business in an unusual way.  I need to host websites to facilitate my clients’ goals.  To do that I need servers, and I need many of them.  I’m a veteran Linux admin, I’ve written more than my fair share of shell scripts, and I’ve managed NOCs and engineers separated by hundreds and thousands of miles, so I don’t think it’s asking too much for this service to be easy to use.

These days I rely on simplicity to ensure that I can squeeze my server administration tasks into my spare time.  However, as my free time shrinks and my server numbers grow (slowly, but surely) I find EC2 less of an ally.  I’ll give you a couple of examples:

Example 1:

I’m building a server to replace a busy local forum.  I want to start my newest AMI for the running server.  Wait a minute… I have data backups, but I don’t have all the newest changes built into a bootable AMI.  OK, I’ll just make an AMI, then boot it.  Easier said than done.

I know there is a sequence of steps I can carry out with the HTTP API for EC2.  I don’t really have the extra time to code that up and make sure it works as expected.  So I pop over to the EC2 management console.  No dice.  If I make an AMI without stopping the running instance first, it stops it for me.  Did I mention the management console doesn’t warn me about that?  Bottom line: using the web interface to EC2 management, I must disrupt service on my otherwise stable server.

Of course I could build a more robust cluster, but do you think I have time for that when I’m spending most of my time each week consulting and writing?  I just want to turn on a server, back it up, keep track of which data is associated with it, and easily launch a new copy of it.  In the modern cloud environment that doesn’t seem like a tall order.  Does it?

Example 2:

I’ve managed the downtime I needed to make a fresh AMI of my running server.  I’m making changes to the way it works, the way it boots, and the configs for various daemons.  To ensure I’ve got a good config I need to reboot the server.  This tells me definitively whether a new server started from my under-development AMI will boot correctly with all the needed services and no unneeded services.  Rebooting…

Darn, now I can’t connect to the database server.  Oh!  That’s right.  Every time I reboot or stop and start an instance it gets a new private IP address.  No biggie, right?  Wrong.  The private IPs are the ones used for on-Amazon connections.  This means I have to go to my development server, get its new private IP, find the security group for the database server, and allow the development server’s new private IP.

Whew!  Honestly, that’s just the start of how Amazon makes NOC/server management so easy AND so complicated.

I might not care so much, but I have probably 10 sites I’ve written on Google’s App Engine.  That is as close to effortless as I can imagine.  I’m not in love with the degree of vendor lock-in there, nor can App Engine solve all problems.  Not by a long shot.  However, it has its place and does what it does with extreme simplicity.  Thank you, Google.

So, recently I’ve been looking for a successor to EC2.  I’ve heard about Rackspace for years, and I thought “Maybe just one or two physical servers could simplify this?  Maybe the virtual thing isn’t necessary for my tasks?”  Then I visited http://www.rackspace.com/cloud/cloud_hosting_products/servers/technology/ and found that they seem to have a very competitive cloud server environment.  Reading their tech specs, server image creation might be easier.  Backups might be easier to manage.  I’m dying to try it!  I just signed up and I’m in the management console now.  Wish me luck!

PS I’ll give you one guess what my next post will be about.  Know what it is? ;)

PPS I’m not getting anything from Rackspace to write this.  They look like a good alternative and I’m excited!