Creating a Natural History Survey

Some organizations may want or need to create a family survey form to collect patient history information from the families they serve. There are several ways to build this type of survey so that it yields identifiable outcome measures. Here are some recommendations:

  • PXE International uses the Genetic Alliance Registry and BioBank, which allows customizable surveys and questionnaires for natural history and epidemiology studies. We have 15 years of longitudinal data in that system on 3,000 people with pseudoxanthoma elasticum (PXE).
  • We (M-CM Network) are running a multi-page anonymous survey right now with SurveyMonkey. We have the Gold plan for $300/year. I'm not sure what "identifiable outcome measures" means, so I can't speak to that specifically. Overall we have found it to be a robust platform and have been happy with it. It was put together and is managed by a non-technical person. It exports a PDF of the survey that we can send to people who don't want to complete it online. The questions themselves were drafted by a graduate student with the aim of establishing some basic natural history information in the short term while we build support for a full longitudinal registry. I wouldn't suggest SurveyMonkey for a longitudinal registry, only for a one-time survey. SurveyMonkey Pricing Details
  • Zoomerang does statistical analysis on the fly, and I think SurveyMonkey does too. Make sure you have IRB approval if you plan to use the data in any way beyond your own use; even then, I would get it anyway. The Genetic Alliance Registry and BioBank has the equivalent of SPSS statistics on the fly.
  • In addition to the resources offered by Genetic Alliance, you might wish to check with the Office of Rare Diseases Research (ORDR) to see if they can offer guidance on building a database that complies with recognized standards for this kind of data collection. ORDR is in the process of coordinating the efforts of such registries so that they can "cross-communicate" and collect similar types of data. For example, here are the elements they believe all of us should be collecting: NIH Rare Diseases Resources. Also, Liz Horn has been writing terrific tips for registries and biobanks: Biobank tips. And you can subscribe to her biobank and registry bulletin: just ask Tam Nguyen, Web Technology Specialist (tnguyen@geneticalliance.org), to put you on the list.
  • Designing a good questionnaire is essential to getting good data (questionnaire design is both a skill and an art form). In week 7 we covered Questionnaire Design, and in week 11 we covered Developing the Questionnaire / Pre-technical Planning. For what you are describing, you will need IRB approval. We have a number of resources on the IRB, including the week 13, week 20, and week 29 Weekly Tips. We recently did a webinar on Demystifying the IRB that may help. We have also archived the first year of WikiAdvocacy Weekly Tips, centered around 5 topic areas. The Genetic Alliance Institutional Review Board is very experienced at reviewing these instruments, and is generally less expensive than other free-standing IRBs.
  • When designing any question, you need to think about how you want to use the answer. Is it to inform generally, or do you want to be more specific?
    • To be able to say X% of people prefer Y?
    • When you have checkboxes, will the user select one answer or check all that apply? It changes how you can present the data (see the sketch after this list).
    • Also think carefully before you use a yes / no / don't know answer choice. When you analyze the data, how will you handle the don't-know responses?
    • Sometimes you really just want a yes/no. (This can be problematic with family history.)
    • Free text causes challenges with analysis, but there are times when it is necessary.
    • Consider the length of the questionnaire. Sometimes people get fatigued from answering too many questions or too many ambiguous questions.
  • We recently launched a very detailed survey, and it has been very interesting to see the data coming back in and to realize how we could have gotten more helpful data with better questionnaire design. I just came across this article about survey design, and it touches on several things that we have already learned from this process, so I thought it may be useful to other organizations. It is aimed at start-ups doing surveys to evaluate use of software products, but I think that most, if not all, of the points are universal: Design Staff Survey. We are still early in the process of evaluating our incoming data, but these are the points I was nodding my head at when I read the article:
    • Only ask what you need to know and can act on
    • Every survey question should have an actionable outcome
    • Pre-test your survey
    • Avoid vague terms
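
As a rough illustration of how the answer format drives the analysis, here is a minimal sketch in Python. It assumes pandas is available, and the responses are made up for illustration only; it is not a prescription for any particular tool.

  # Hypothetical responses, for illustration only.
  import pandas as pd

  # "Select one" question: each respondent gives exactly one answer,
  # so the percentages sum to 100% across the options.
  select_one = pd.Series(["yes", "no", "don't know", "yes", "yes", "don't know"])
  print(select_one.value_counts(normalize=True) * 100)

  # Decide up front how don't-know answers are handled: excluding them
  # changes the denominator, and therefore the "X% said yes" figure.
  answered = select_one[select_one != "don't know"]
  print((answered == "yes").mean() * 100)    # % yes among those who answered
  print((select_one == "yes").mean() * 100)  # % yes among all respondents

  # "Check all that apply" question: each respondent can select several
  # options, so percentages are reported per option and do not sum to 100%.
  options = ["fatigue", "headache", "dizziness"]
  check_all = [
      ["fatigue", "headache"],
      ["headache"],
      [],  # this respondent selected nothing
      ["fatigue", "headache", "dizziness"],
  ]
  flags = pd.DataFrame([{opt: (opt in row) for opt in options} for row in check_all])
  print(flags.mean() * 100)  # % of respondents who selected each option

The point is not the code itself; it is that the choice of denominator and of single versus multiple selection has to be made when the question is written, not after the data come back.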

So here's a very specific example from our survey. M-CM has vascular birthmarks as a characteristic, and the naming of these birthmarks is wildly inconsistent across specialties and localities, even from doctor to doctor. Our survey asked, "Do you have any of these skin findings?" followed by a long checkbox list of the clinical names that everyone (doctors, researchers, patients) uses inconsistently. As I look over the data, it is almost meaningless, because it is very likely that people are calling the same things by different names. If we had thought about why we were asking that question and what we would do with the data, I believe we would have anticipated this problem and designed the question in one of a number of different ways, none of which included clinical names for birthmarks.

At this moment, my thinking is that I want every question to be driven by an objective (actionable outcome) or it gets cut (keep surveys short). The objectives should come before the questions in the development process. We also had some grids in our survey, as discussed at the end of the article, so I thought that was useful. Our survey was about syndrome natural history information, but we could very well be conducting surveys about patient experience (the diagnostic journey, identifying appropriate specialists) that would have more relevance for some of the marketing-oriented points in the article.
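
Returning to the birthmark question above: one way to make something of data like that after the fact is to recode the synonymous clinical names into a few broader buckets before counting anything. Below is a minimal Python sketch under stated assumptions: the survey export can be read as one list of checked labels per respondent, and the name-to-category mapping shown is purely illustrative, not a clinical reference.

  # Illustrative only: hypothetical checkbox labels and an example mapping of
  # synonymous or overlapping clinical names onto broader analysis categories.
  from collections import Counter

  CATEGORY = {
      "port-wine stain": "capillary malformation",
      "nevus flammeus": "capillary malformation",
      "capillary malformation": "capillary malformation",
      "cutis marmorata": "cutis marmorata (CMTC)",
      "cutis marmorata telangiectatica congenita": "cutis marmorata (CMTC)",
  }

  # Checkbox selections as they might come back from respondents.
  responses = [
      ["port-wine stain"],
      ["nevus flammeus", "cutis marmorata"],
      ["capillary malformation"],
      ["cutis marmorata telangiectatica congenita"],
  ]

  # Recode each label; anything not in the mapping is flagged for manual review
  # rather than silently counted as its own category.
  counts = Counter(
      CATEGORY.get(label.lower(), "needs review: " + label)
      for person in responses
      for label in person
  )
  print(counts)

Even with recoding, the underlying lesson stands: it is better to design the question around the objective than to repair inconsistent terminology afterwards.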