Thursday, 30 April 2015

Assessing the impact of #ict4d interventions: Going beyond access and infrastructure indicators

One of the many challenges of a development practitioner is to assess the impact of development interventions. Add to this the task of figuring out how an ICT4D component has helped or hindered development, and the exercise becomes even more challenging.

While the development community has a comprehensive set of indicators for rural development and agriculture-related interventions, we are lagging a bit behind when it comes to ICT4D indicators.

This said, our ITU colleagues have compiled a set of core ICT indicators covering access and infrastructure. While this is commendable, these two sets of indicators are not sufficient to tell the full development story. For example, the access indicators are limited to physical access to ICTs and do not take into account aspects such as literacy. As such, coming up with a comprehensive set of ICT4D indicators remains an open challenge.

So here is what I have been thinking: if we were to complement access and infrastructure indicators with indicators on the appropriateness of ICTs for the target population, how they are used, and the extent to which they contribute to social and economic transformation, would this be a good starting point for a comprehensive set of ICT4D indicators?

Another domain where we require indicators is national policy, so that we can assess whether policies are conducive both for the target population and for potential investors, and whether they allow open and transparent competition.

Last but not least, the sustainability of the intervention and its potential for scaling up could constitute another domain.

We know that development interventions have their own set of indicators. I am now totally convinced that the only way we can assess the impact of an ICT4D intervention from all the different perspectives and angles is to embed the specific ICT indicators in the overall development project, as opposed to having standalone indicators. This will allow us to have a better grasp of how and whether the ICT4D intervention has contributed to the overall socio-economic development impact.

In terms of infrastructure and access, the ITU indicators provide statistics on how and whether individuals, households and businesses have access to landlines and mobile phones, the extent of mobile phone penetration and use, the number of computers, the availability and use of broadband, and so on.

Moving now to the proposed domains: in terms of appropriateness, how can we assess whether a technology is appropriate? How can we assess whether a service delivered thanks to a technology is appropriate? Could we say that if a household is willing to spend x% of its disposable income on an ICT service, that makes it appropriate? Can affordability be a parameter? Could we say that if a community has taken ownership of the technology, that makes it appropriate? What about the cultural appropriateness of a technology?

I would say that locally relevant content is definitely something we should take into account, along with how the technology has contributed to, and provides, social and economic opportunities for progress.

On the transformational side, one indicator to consider is whether the introduction of ICTs has led to the community acquiring new skills and whether there was any capacity development at individual and/or institutional level. This could be anything from improved negotiation skills, to acquiring technical knowledge on the use of the technology, to automation of manual tasks leading to a transition from semi-skilled to skilled labour.

Another indicator could be whether the introduction of ICTs has contributed to or enhanced social inclusion and interaction.

Taking this further, we could examine whether timely access to information has led to better decision-making and whether the introduction of ICTs has been an impetus for increased and improved local content creation, leading to the demise of information gatekeepers.

Last but not least in this category, another indicator could be the extent to which ICTs were equally available to women and young people, and how and whether this has led to their empowerment and positioned them on an equal footing with other members of the community.

As far as the economic indicators are concerned, some obvious ones are how and whether ICTs have:
  • created new employment opportunities and, if so, whether this has been in the formal or informal labour market, off-farm or on-farm; whether new businesses were formed; and how this has contributed to enhancing the bargaining power of the beneficiaries
  • led to a vibrant rural environment which has helped curtail migration from rural to urban areas
  • contributed to an increase in income, and what percentage of the increase in GDP can be attributed to the deployment of ICTs. Taking this further, I wonder if we can go as far as giving figures for the number of people lifted out of poverty thanks to a specific ICT or a specific ICT4D intervention
  • led to increased household expenditure in this sector. Can we assume that if there is an increase in expenditure, it is because the household finds the technology, and the content it delivers, appropriate?
Moving on to the policy level, here is a menu of options:
  • are ICTs part of sectoral national policies? For example, is the national agriculture, health or education policy ICT-enabled?
  • does the country have a national technology policy and, if so, does it advocate universal access, and in what form?
  • are national policies conducive to creating the right environment for public-private-people partnerships?
  • do national policies encourage the public and private sectors to invest in ICTs?
  • do national policies foster competition and transparency?
  • are ICT policies gender- and youth-sensitive; do they ensure equitable access?
Last but not least, in the domain of scalability and sustainability, I guess we should be assessing the degree to which the ICT4D intervention responded to and met the needs of the local communities, and assessing its sustainability once the funding is over. This could be in terms of knowledge transfer to maintain and operate the technology; the sustainability of the business model, where the ICT4D intervention led to the creation of a business; and the prospects for expansion.

In terms of scalability we would need to assess the replicability of the intervention. Here I am not talking about a cookie-cutter approach, as this never works. I am talking about understanding and assessing the context, evaluating the feasibility of replicating an experience in a similar environment, and/or assessing what modifications need to be made so that it can be replicated in a different context. We know that nine times out of ten this will require tweaking and adaptation to meet the needs of the local population and respond to local realities.

To conclude, I am putting some of my thoughts on the table, and I would like to seek your views and guidance on what could potentially be a sound set of indicators to assess the social, economic and political impact of ICT4D interventions.

And lastly, what do you think are, or should be, the ingredients of a "successful", sustainable and scalable ICT4D intervention?

Monday, 20 April 2015

Surveys are Us... Tips to design engaging and meaningful surveys #kmers

When was the last time you completed a survey or questionnaire that had self-explanatory, clear questions and you could figure out why it was important to complete it? When was the last time that, while completing a survey, you got a good feel for how the results would serve as an input to bring about change?

Whether we like it or not, surveys have become a staple of modernity, be they customer satisfaction surveys, censuses, opinion polls, household surveys, attitude surveys or surveys for research purposes.

Typically surveys are used to:

  • get feedback from an "audience"
  • collect information/statistics about an "audience" group
  • understand the "audience" needs, challenges and opportunities in an effort to make informed decisions 
  • assess the impact of an intervention and/or activity on an "audience"
Should you need to design a survey, you may wish to take a few minutes and consider the following:
  • What's the purpose of the survey? Why are you designing and launching it?
  • What type of data/information and feedback do you wish to collect?
  • Who is your audience? (demographics, social and economic status, occupation, literacy rate)
  • Why is it important to reach out to your audience?
  • How will you be using the results, outcome and data from your survey?
  • What is your plan to share the survey results with your audience?
  • Is the survey a one-off, a follow-up to a previous survey, part of a research project?
  • Are you undertaking the survey on behalf of a third party?
Survey goal and purpose (the why)
Once you have clearly identified and articulated the goal and purpose of your survey, make sure you write it down, because this is what you will use as your survey's introductory text! In that text, include the survey deadline along with how and when you'll be sharing the results.

Know your audience (the who)
Surveys are a communication tool; as such, it is important to know who the target audience is. Here are some points to consider:
  • know who you wish to reach, as this will determine the questions you will ask, the timing, and the format in which you will share the survey
  • provide clear instructions, including what you will do with the results and how long it will take to complete the survey. Be honest: if it takes 20 minutes, say so, otherwise you risk putting off your audience and ending up with fragmented, unusable data
  • make sure your survey is tailored to the literacy level and language group of your audience
  • if you opt for an on-line survey, make sure your audience has access to the appropriate technology
The questions (the what)
Once you've figured out why you are doing the survey and who your target audience is, you need to compile your questions. One overarching tip for formulating powerful questions is to KNOW what you will do with the responses. If you cannot figure out how you will use the responses to a question, either reformulate the question or drop it. You may find the following suggestions useful:
  • craft clear and concise questions (preferably in plain English and jargon-free)
  • ask one question at a time. Do not stack your questions and do not use AND/OR in a question
  • use multiple choice, true/false, checklist and rating-scale questions, as these make your compilation task easier and you do not run the risk of having to interpret the responses
  • use an even number of points for rating-scale questions (for example, 1-4 where 1 is poor and 4 is excellent). This way you encourage the respondent to provide a meaningful answer rather than settling for the middle ground. That said, where appropriate, provide an N/A (not applicable) option. Make sure you assign numeric values to your rating-scale questions, as this will facilitate compilation (see the short sketch after this list)
  • where applicable and appropriate, consider asking the respondent how often an activity occurs, as opposed to simply asking for an approximation such as never, seldom, often, always
  • keep open-ended questions to a minimum (structured questions make compilation work easier and you do not risk falling into the 'interpretation' trap)
  • group questions logically and, if appropriate, break up your survey into logical sections
  • figure out which questions are of utmost importance - for which you require an answer or else your survey will be void - and make those mandatory
  • include demographics (such as gender, age, etc.) so that you can disaggregate the results. If your survey is anonymous, make sure the demographic questions are in line with your anonymity framework
  • use validation questions as appropriate 
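To make the points about numeric coding and disaggregation concrete, here is a minimal sketch in Python using pandas. The column names, the 1-4 scale and the gender field are illustrative assumptions, not the output of any particular survey tool.

```python
import pandas as pd

# Illustrative responses: a 1-4 rating question coded numerically,
# with "NA" where the respondent chose "not applicable",
# plus a gender field collected for disaggregation.
responses = pd.DataFrame({
    "gender":    ["F", "M", "F", "M", "F"],
    "q1_rating": [4, 2, "NA", 3, 1],   # 1 = poor ... 4 = excellent
})

# Treat "NA" as missing so it is excluded from the averages.
responses["q1_rating"] = pd.to_numeric(responses["q1_rating"], errors="coerce")

# Overall average and number of usable answers for the question.
print(responses["q1_rating"].agg(["mean", "count"]))

# Disaggregate the same question by gender.
print(responses.groupby("gender")["q1_rating"].agg(["mean", "count"]))
```

Because each answer is already a number, compilation becomes a couple of aggregation calls rather than a manual interpretation exercise.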
Format (the how)
Once you know who your audience is, you will be able to decide whether to opt for an electronic, print or interview format (in person or by telephone). In determining the format, consider the following:
  • access to technology
  • literacy rate: in the case of low literacy, you may opt for a pictorial version of the survey or conduct a face-to-face or phone interview
  • the respondent's cultural context; also make sure you are gender sensitive
If you opt for an on-line or print format (mailed or manually distributed), remember that the eye wants its share as well. Make sure your survey is well formatted and visually appealing. In the case of a print survey, allow enough space between questions and allocate adequate space for responses to open-ended questions.

Timing (the when)
The timing of a survey can contribute to a higher response rate. Knowing your audience will help you decide when is the best time to launch your survey. For example, if you were to survey farmers, you would try to avoid peak harvest time, as you know they will be busy in the fields and have other priorities.

I am adding the survey deadline under this heading. Decide how long you'll be running your survey; seven to ten days seems to be the norm. Send reminders four days and two days before the deadline.

Field test
Put yourself in the shoes of the respondent and think of the frustration of completing a survey that does not work or has unclear questions.

This is why it is really important to field test your survey before launching it. By field testing, I am not just talking about making sure the technical and mechanical parts work. More importantly, the field test is there to assess whether all your questions are clear, make sense and are relevant to your audience.

For field testing, choose people who were not involved in the design process. If you can afford the luxury of having someone from your audience group, go for it and have them complete the survey. That would be the best litmus test.

Response rate
While it makes total sense to aim for a 100% response rate, conventional wisdom says that the average response rate is 30-40% for on-line surveys and 60-70% for mailed ones.
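As a rough, back-of-the-envelope sketch (the figures below are assumptions for illustration, not recommendations): if you know how many usable responses you need, the expected response rate tells you roughly how many invitations to send.

```python
import math

# Hypothetical planning figures.
needed_responses = 200    # usable responses you want
expected_rate = 0.35      # assumed 35% response rate for an on-line survey

invitations = math.ceil(needed_responses / expected_rate)
print(f"Invite roughly {invitations} people")   # -> Invite roughly 572 people
```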

Results
You would hope that all your respondents have duly completed the survey. This is why it is important to decide which questions are mandatory, so that you avoid the risk of getting partial responses that could jeopardize the validity of your efforts.

Read the answers to the open-ended questions carefully. To the best of your ability, try to stay as objective as possible; this is why it is best to keep these types of questions to a bare minimum.

Once you've compiled the results and have a good understanding of what they are telling you, share them with your respondents. You may do so in narrative form, complemented with graphs and charts.
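As an illustrative sketch of that last step (assuming the results were compiled in Python, as in the earlier example), a simple bar chart can accompany the narrative; the question labels and averages below are made up.

```python
import matplotlib.pyplot as plt

# Hypothetical compiled results: average rating per question on the 1-4 scale.
summary = {"Q1": 3.2, "Q2": 2.6, "Q3": 3.8}

plt.bar(list(summary.keys()), list(summary.values()))
plt.ylim(0, 4)
plt.ylabel("Average rating (1 = poor, 4 = excellent)")
plt.title("Survey results at a glance")
plt.savefig("survey_results.png")   # share this chart alongside the narrative
```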

In sharing the results, and depending on the type of survey, let the respondents who were diligent enough to complete it know how and when you'll be taking action.

Resources
I know the above is far from comprehensive; nonetheless, I hope you find it useful. I encourage you to also check out The University of Wisconsin Survey Center for valuable resources and guidance on how to design and implement surveys.





Thursday, 9 April 2015

The role of media, journalists and reporters in a networked global world #globaldev


Reading the New York Times Op-Ed "Yes, we were warned about Ebola", I kept thinking: how can we make sure mainstream media cover development and humanitarian issues before these turn into catastrophes?

I kept asking myself how we can influence the "subjective" selection of news and raise awareness that, in our networked world, the outbreak of a disease, drought and famine miles away from our homes, wars in distant lands, genocides among cultures different from ours, and extreme weather in other parts of the world may prima facie appear to be local news, but will ultimately have a dramatic and serious impact at the global level, and thus on us as individuals.

So would it not be better if this seemingly local news were covered adequately from the outset, rather than once the damage is done?

Unlike the medical profession, which encourages prevention over cure, the media seem to prefer to wait until some "newsworthy" event - aka a catastrophe - happens and then "run the news". This is not because reporters and journalists are evil people; it is because that is the nature of the beast.

Looking back in time, in the novel Scoop Evelyn Waugh described news as follows: "Look at it this way. News is what a chap who doesn't care much about anything wants to read."

A couple of decades earlier, Pulitzer described news as "What is original, distinctive, dramatic, romantic, thrilling, unique, curious, quaint, humorous, odd, apt to be talked about, without shocking good taste or lowering the general good tone, above all without impairing the confidence of the people in the truth of the stories of the character of the paper for reliability and scrupulous cleanliness."

More recently, Alain de Botton defined news as "the determined pursuit of the anomalous."

A century ago, John Bogart shared his view on what constitutes news: "When a dog bites a man, that is not news, because it happens so often. But if a man bites a dog, that is news."

I wonder whether the reporters covering dog bites will ever consider the incidence of rabies before discarding the news of a dog biting a man as "non-news", as opposed to waiting until rabies in the neighbourhood becomes an epidemic to cover the story.

Let's pause a moment and remind ourselves that the goal of journalism is to keep citizens informed and apprised of the news that may affect them as individuals or impact their communities. And yes, journalism also has a watchdog function: to report on what governments are doing. And we all know that news has a cycle; something that is news today may not be news in three days' time.

While this may be true for what I label as conventional news covered by major media outlets, this is not the case for development news. 

Development-related news is always NEWS. Development-related news remains news until such time as there is no famine, no drought, no adverse climatic event, no epidemic, no disease outbreak, no child malnutrition, no poverty, no land grabbing, no gender inequality, no child labour, no exodus of displaced people, no overcrowded refugee camps, no food shortage, no genocide and no humanitarian crisis.

After all, is not covering the news a means of providing facts and giving context while bringing attention to global and local issues? So why is it that there is no steady flow of development-related news, and why have we still not cracked this nut?

The Economist piece Coming full circle argues that “The biggest shift is that journalism is no longer the exclusive preserve of journalists. Ordinary people are playing a more active role in the news system, along with a host of technology firms, news start-ups and not-for-profit groups. Social media are certainly not a fad, and their impact is only just beginning to be felt. “It's everywhere—and it's going to be even more everywhere,” says Arianna Huffington. Successful media organisations will be the ones that accept this new reality. They need to reorient themselves towards serving readers rather than advertisers, embrace social features and collaboration, get off political and moral high horses and stop trying to erect barriers around journalism to protect their position. The digital future of news has much in common with its chaotic, ink-stained past.”

Pulitzer Prize winner Max Frankel said: "since no one can precisely define the nature of news, virtually anyone can claim to be a journalist." And the 2006 Pew Internet and American Life Project found that 34% of bloggers consider their blog a form of journalism.

It's fair to say that there are not enough journalists to cover all the news. And yet today, despite advances in technology and the increasing acceptance of crowdsourcing as a form of reporting, we still do not seem able to provide adequate coverage of development-related news.


I wonder whether the world of journalism, reporters and media in general would consider leveraging development workers as eyewitnesses, allowing them to contribute to the news agenda and advocacy journalism. 

By doing so, they would have a continuous and steady flow of information and news, not only raising awareness about global events, facts and realities that will sooner or later impact people's lives at all levels, but also taking a proactive role in contributing to what Philip Graham called the "first rough draft of history."