Brain Vibe

marketing muses to stay engaged

Does Data Quality Matter in Social Media?

Data-driven marketing relies on high-quality data, but with the introduction of social media and its pervasiveness in the marketing toolkit, it is easier than ever to engage with your market without correct emails, addresses, or profiles. This raises the question: does data quality still matter for marketing in a Web 2.0 world?

I think the answer is, “Yes, but…”

Most B2B marketers work in a direct-marketing, bottom-of-the-funnel mindset because they are closely tied to sales goals. Since sales won't accept a lead without, at a minimum, knowing who it is and the appropriate contact information, that information has to be collected at every opportunity. Without it, marketing also lacks an adequate single view of the customer to profile and segment reliably. In this context, data quality is critical: it determines whether a lead is passed, how it is passed, and how it aligns to existing opportunities or account profiles. Name, company, location, phone, and email are the cornerstones.
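
As a concrete illustration of that minimum bar, here is a small sketch of the kind of gate a lead might pass through before handoff to sales. The field names and record shape are hypothetical, not any particular CRM's schema:

```python
# Minimal sketch of a lead-handoff gate: pass a lead to sales only when the
# cornerstone fields are present. Field names are illustrative assumptions.

REQUIRED_FIELDS = ("name", "company", "location", "phone", "email")

def ready_for_sales(lead: dict) -> bool:
    """Return True if every cornerstone field is present and non-empty."""
    return all(str(lead.get(field, "")).strip() for field in REQUIRED_FIELDS)

lead = {"name": "Pat Example", "company": "Acme", "location": "Austin, TX",
        "phone": "", "email": "pat@example.com"}
print(ready_for_sales(lead))  # False -- no phone, so marketing keeps nurturing
```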

Social media is not outreach; it is in-reach. It isn't lead generation; it is relationship generation. You don't collect details on your connections and contacts; you cultivate engagement and conversation. With no need to maintain a list of connections in your CRM, and with social media organizers like HootSuite to communicate with your community, contact information is somewhat irrelevant.

So where is data quality necessary? In having a single customer view that includes social media profiles and engagement. At some point, we B2B marketers do need to move relationships out of the 2.0 world and into face-to-face engagements, particularly for complex sales. At that transition point, the social media profile becomes an invaluable part of the customer view. Just as CRM captures order transactions, direct marketing interactions, and sales interactions, it also needs to show social media interactions. Why? The social media interaction is probably more telling of your relationship with your customers than traditional interactions are.

The catch? Linking a limited profile from LinkedIn, Twitter, or Facebook to a standard contact profile in CRM can be problematic. Your CRM system may not have the capability, or may not have it enabled, to link the 2.0 world with your customer data. Your social media platform may not be capturing what is needed to integrate online activity with your customer data in CRM. Or it does, but the integration still needs to be established. Those are just a few examples.
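
To picture why that linkage is hard: a social profile often carries little more than a display name and perhaps a company, so matching it to a CRM contact is fuzzy at best. Here is a rough sketch using Python's standard difflib; the weights, threshold, and field names are assumptions for illustration, not a production matcher:

```python
# Rough sketch of linking a sparse social profile to CRM contacts by fuzzy
# name/company similarity. Weights and threshold are illustrative only.
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def best_crm_match(profile: dict, contacts: list[dict], threshold: float = 0.7):
    """Return the most similar CRM contact, or None if nothing clears the bar."""
    best, best_score = None, 0.0
    for contact in contacts:
        score = (0.7 * similarity(profile["name"], contact["name"])
                 + 0.3 * similarity(profile.get("company", ""),
                                    contact.get("company", "")))
        if score > best_score:
            best, best_score = contact, score
    return best if best_score >= threshold else None

contacts = [{"name": "Patricia Example", "company": "Acme Corp"},
            {"name": "Pat Sample", "company": "Globex"}]
print(best_crm_match({"name": "Pat Example", "company": "Acme"}, contacts))
# -> {'name': 'Patricia Example', 'company': 'Acme Corp'} at this threshold
```

Even a matcher like this only narrows the candidates; a person usually still has to confirm the link before the profile joins the customer view.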

Ultimately, data quality will matter for social media as B2B marketers mature in linking 2.0 activities to best practices for lead creation, nurture, and pipeline generation. For now we live in customer relationship silos, but the real advantage of social media, and its ability to show marketing ROI, will come from improved integration and profile management across the entire relationship. As soon as integration is introduced, just as in the past, data quality plays a critical role.

Filed under: b2b, CMO seat, crm, data quality, marketing technology, social media

Data Governance: More Than Ownership

When kicking off data management initiatives, a large and key component is establishing the data stewards who represent the data that is collected, managed, and leveraged in business intelligence. By naming these data stewards, and subsequently a data management committee, companies feel safe that proper data governance practices will be put in place. Not so. Ownership (that is, stewardship) does not equate to governance.

Many factors contribute to governance, and business boundaries can quickly break down if you approach governance in business silos. As you walk through your data collection process, you'll quickly find that what is considered the preferred source of data may not be generated by the team that determines what should stay, what should be modified, and what should go. In fact, depending on how you view the data, conflicts arise over what is considered accurate, appropriate, or the contributing factor in a decision from a given business point of view.

This is something I've run into recently while building a business intelligence solution for web analytics. Even within my own department of advertising executives, views of which transactional data should be considered the source of record are up for grabs depending on who receives the information and how it is used. Required levels of accuracy vary depending on when the data is needed, whether it will be used for marketing optimizations, or whether it will be used to actualize spending for billing. Throw into the mix that data feeds coming from vendors are constantly changing as they actualize transactions over the course of days, weeks, and even months, and finding the truth in the data becomes a challenge that defies religious opinion on the subject.

Sorting through the challenges of governance to determine what makes data reliable requires looking at a variety of factors and allowing for multiple views and uses.

  • Reliability of source
  • Time of collection
  • Actualization
  • Business process affected/use of data in decisions
  • Degree of accuracy required

Notice that I do not include ownership. Ownership is the artificial governance: in establishing governance, it only serves to create a framework around the above factors that lends them credibility. Ownership, and its transformation into stewardship, serves to continuously monitor, enforce, and improve governance around data needs.
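
To make those factors concrete, one way to operationalize them is as metadata recorded against each candidate feed, letting the use case, not the owner, pick the preferred source. A minimal sketch; the field names, sources, and dates are all made up for illustration:

```python
# Illustrative sketch: record the governance factors above as metadata on each
# candidate feed, then let the use case select the preferred source.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class FeedSnapshot:
    source: str
    reliability: float       # 0..1 trust in the source
    collected_at: datetime   # time of collection
    actualized: bool         # has the vendor finished restating the numbers?
    accuracy: float          # 0..1 measured or estimated accuracy

def preferred_feed(feeds, min_accuracy):
    """Prefer an actualized feed meeting the accuracy this use requires;
    fall back to the freshest feed when nothing is actualized yet."""
    qualified = [f for f in feeds if f.actualized and f.accuracy >= min_accuracy]
    if qualified:
        return max(qualified, key=lambda f: f.reliability)
    return max(feeds, key=lambda f: f.collected_at)

feeds = [
    FeedSnapshot("ad_vendor_raw", 0.6, datetime(2010, 3, 2), False, 0.80),
    FeedSnapshot("ad_vendor_final", 0.9, datetime(2010, 2, 25), True, 0.98),
]
print(preferred_feed(feeds, min_accuracy=0.95).source)  # ad_vendor_final (billing)
print(preferred_feed(feeds, min_accuracy=0.99).source)  # ad_vendor_raw (freshest)
```

The same record can be "true enough" for marketing optimization and not yet true for billing; the metadata, not the owner, arbitrates.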

Start your data management off on the right foot: don't confuse ownership with governance.

Filed under: business intelligence, data quality

Ensuring quality data from service providers

For those of us who have lived, eaten, and slept data quality and data management, it is hard to fathom that there are still pockets of organizations that have yet to define a solid foundation of data quality and data management best practices. It is harder still to take a step (or leap) back into the roots of how data quality and data management issues all began. Well, let me tell you, those pockets are alive and well in the most unlikely places: the companies that provide data.

To be fair, there are some amazing companies out there that provide information and data we use to improve and enhance our own data, or that we take and analyze independently. They may not be perfect (no one is!), but they have defined themselves as organizations serving "better quality data" and stand by it with best practices of their own. Yet while enterprise organizations and even mid-sized companies have jumped on the bandwagon and adopted sophisticated processes, solutions, and people dedicated to better information, a significant number of service providers still lack the skills, tools, and practices that would ensure reliable information to measure our performance, understand our market, and take advantage of new opportunities.

At the end of the day, the data and information we source needs to be reliable. It is important to guard yourself both when contracting with service providers and when you receive data. Simply trusting that the data is high quality when it arrives is not good enough. You need to be vigilant when sourcing providers, and you need to clearly define how you will verify that what you received is what you paid for. Here are some things to consider and ask when working with data providers (a small acceptance-check sketch follows the list):

  • How do they collect their information?
  • How do they verify that the information is valid?  What process, sources, and analysis is used?
  • Are they providing data to other customers for the same purpose you need it for?  How many, and what portion?
  • What is their repeat business rate?  Who are their top customers?
  • For what purposes are their customers using the data?
  • What do they do to verify and validate your data prior to providing it to you?
  • What do they do to verify that the data they are providing is complete?
  • What guarantees do they or will they provide that the data meets your specifications and quality standards?
  • What is required on your end to validate that the data is accurate and reliable?
  • If you are purchasing tracking data (real-time or periodic feeds), what initial and recurring testing processes are used to verify proper data transfers?
  • What is required on your end to ensure the data transfer works, both initially and on an ongoing basis?
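
As a concrete example of those last points, here is a minimal sketch of an acceptance check you might run on each delivery before loading it. The CSV layout, required columns, and row-count tolerance are assumptions about a hypothetical feed, not a universal standard:

```python
# Minimal sketch of an acceptance check for a vendor delivery. Layout, column
# names, and the 2% row-count tolerance are illustrative assumptions.
import csv

REQUIRED_COLUMNS = {"record_id", "company", "email", "updated_at"}

def accept_delivery(path, expected_rows, tolerance=0.02):
    """Return a list of problems; an empty list means the delivery passes."""
    problems = []
    with open(path, newline="", encoding="utf-8") as handle:
        reader = csv.DictReader(handle)
        missing = REQUIRED_COLUMNS - set(reader.fieldnames or [])
        if missing:
            return [f"missing columns: {sorted(missing)}"]
        rows = list(reader)
    # Completeness: the row count should be near what the contract promised.
    if abs(len(rows) - expected_rows) > expected_rows * tolerance:
        problems.append(f"row count {len(rows)} outside tolerance of {expected_rows}")
    # Validity: required fields populated, and no duplicate keys.
    blanks = sum(1 for r in rows
                 if not all((r[c] or "").strip() for c in REQUIRED_COLUMNS))
    if blanks:
        problems.append(f"{blanks} rows with blank required fields")
    ids = [r["record_id"] for r in rows]
    if len(ids) != len(set(ids)):
        problems.append("duplicate record_id values")
    return problems
```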

What have you done to ensure data from service providers is what you want?

Filed under: data quality

Archiving Strategy: Data Relevance

We often think about the relevance of data when we want to include or exclude it from analysis or process. But are you thinking about relevance as part of your data quality effort?

Just as you focus data quality efforts on cleaning existing information, there are invariably records that can't be cleansed or enhanced. They have no value in either business analytics or business process. They are noise, similar to the noise you get from bad data. Keeping and maintaining them in your database can compromise your ability to analyze information accurately, continue to deflate confidence in the data, and, if they make up a significant percentage of your database, cause performance problems and added maintenance. Developing an archival strategy as part of your data quality practice is a significant component that should not be overlooked.

Benefits of Data Relevance

  • Trust in data
  • Enables process
  • Accuracy of analysis
  • Supports decisions
  • Database optimization

It can be tempting to simply delete records from your databases. But this can have a detrimental effect due to data dependencies within your databases, and it can cause non-compliance in regulated environments. Instead, it is best to formulate a strategy that flags non-relevant data, removing or suppressing it from user interfaces and analytics.
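
In practice, that usually means a soft-delete flag plus views that filter on it. A minimal sketch using SQLite; the table, column, and view names are made up for illustration:

```python
# "Flag, don't delete": an archived_at column marks non-relevant records, and
# a view hides them from everyday queries while the rows stay available for
# dependencies and compliance. Schema is a hypothetical example.
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE contacts (
        id INTEGER PRIMARY KEY,
        name TEXT,
        archived_at TEXT        -- NULL means the record is still live
    );
    CREATE VIEW active_contacts AS
        SELECT id, name FROM contacts WHERE archived_at IS NULL;
""")
db.executemany("INSERT INTO contacts (name, archived_at) VALUES (?, ?)",
               [("Live Record", None), ("Stale Record", "2010-03-01")])

# Interfaces and analytics read the view; the base table keeps everything.
print(db.execute("SELECT name FROM active_contacts").fetchall())  # [('Live Record',)]
print(db.execute("SELECT COUNT(*) FROM contacts").fetchone()[0])  # 2
```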

Components of Archiving Strategy

  • Data decay rates – Attributes of records lose relevance over time. This component is a good guide for how frequently you should focus cleansing efforts. It also indicates when data is approaching the horizon at which a record loses its relevance. The age of the data and the activity related to a record, even if the record is complete, can signify whether the data is still relevant or open to archiving (see the sketch after this list).
  • Minimum requirements of record viability – Records should continually be assessed to determine if they meet the minimum standards of use.  Failure to meet minimum requirements is a leading indicator that the record is a candidate for archiving.
  • Relevance of record to analysis, process, decisions – If a record is not going to be used in analysis, process, or decision making, there is no need to keep it in use. This may be the case when processes have been optimized and certain information is no longer needed, or when a record became a candidate for archiving due to decay rates and minimum data requirements. Relevance may also be determined when integrating systems, where old records with old transaction history are not relevant to the existing or new business.
  • Regulatory compliance – In highly regulated environments like health care, there are standards for what you can and cannot remove. Records may not be useful in existing process, analysis, and decision making, but might be required for certification or other compliance-related activities. Archiving ensures that information is not deleted from primary systems, though you may need to provide a mechanism that gives adequate access to the data for compliance.
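
Pulling the first three components together, here is a sketch of how archive candidates might be nominated. The 24-month decay horizon, the minimum fields, and the record shape are all assumptions for illustration:

```python
# Sketch combining decay rate and minimum viability to nominate archive
# candidates. The horizon and required fields are illustrative assumptions.
from datetime import date, timedelta

MINIMUM_FIELDS = ("name", "company", "email")  # minimum bar for a viable record
DECAY_HORIZON = timedelta(days=730)            # ~24 months without activity

def archive_candidate(record, today):
    viable = all(str(record.get(f, "")).strip() for f in MINIMUM_FIELDS)
    decayed = (today - record["last_activity"]) > DECAY_HORIZON
    return decayed or not viable

records = [
    {"name": "A. Active", "company": "Acme", "email": "a@acme.example",
     "last_activity": date(2010, 1, 15)},
    {"name": "B. Dormant", "company": "", "email": "",
     "last_activity": date(2007, 6, 1)},
]
print([r["name"] for r in records if archive_candidate(r, date(2010, 3, 1))])
# -> ['B. Dormant']
```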

An archiving strategy is a critical component of data quality best practices. It will continually help you focus and refine your data quality projects, and it will push you to think strategically about how you use and manage your data day to day. Establish an archiving strategy at the forefront of your data quality initiatives and you will start your efforts off on the right foot.

Filed under: Uncategorized

Stuck in First Gear

Big investments have been made in IT in recent years. IBM, Oracle/Siebel, and SAP lead the market and have succeeded not only with multi-national enterprise companies but also with mid-sized companies. There are a lot of companies out there that have purchased application and data management/data warehouse solutions only to find themselves using a fraction of what those solutions can do. It's like driving a Porsche in first gear.

There are some fundamental reasons for this, beyond any feeling that the sales executive sold them the wrong bill of goods. IT will blame the business for not knowing what it wants. The business will blame IT for not getting it. It doesn't really matter; there is plenty of blame to go around. What matters is that you now have a solution that isn't delivering the benefits it really could and should.

Maybe I'm a bit biased since I'm the data chick. Well, more than a bit. Regardless, I think that from a data management perspective, companies are failing. The maniacal focus on process efficiency has drowned out the fact that process runs on data and feeds data. This focus has kept data in the back seat too long, and now that we need it to better understand our customers and our business and to make decisions, it is sorely lacking. Our data lacks unity, structure, definition, and most of all purpose. Companies simply cannot leverage their information except at very basic levels. When things are good, this may be okay. When things are bad, this is a real problem.

What makes this even sadder is that companies are looking to spend more money on applications and data infrastructure to 'fix' the problem. The promise of a new model with more sophisticated bells and whistles that will solve anything you throw at it is just marketing. Until you understand and control what you already have under your hood, getting something bigger, better, and shinier isn't going to help any more than what you have now. So there was no ROI on existing purchases, and there won't be any ROI on new ones.

There are two things companies need to do to make the investments in enterprise solutions worthwhile:

  • Clean up the back-end data management practice so that it is fluid with business process and application usage.
  • Have a clear data management strategy for new applications that is fluid and scalable outside of application databases.

Your company may already be embarking on SOA or MDM projects. But have you looked at how these new practices will support applications beyond changing the oil? Can the data drive process?

Today, applications are bogged down because data is treated as something to put in the trunk and hoard. Until data is thought of as fuel, your IT investments will stay in first gear and never reach sixth. Now how fun is that?

Filed under: Uncategorized
