community-surveys's Issues

Gather questions from teams/committees/councils

Initial questions:

  • WCA Communication Team (WCT) — Communications section
  • WCA Competition Announcement Team (WCAT) — Question about WCRP naming policy
  • WCA Disciplinary Committee (WDC) — Disciplinary Actions section
  • WCA Data Protection Committee (WDPC) — Question about rankings by age
  • WCA Ethics Committee (WEC)
  • WCA Financial Committee (WFC) — no questions
  • WCA Marketing Team (WMT) — branding
  • WCA Quality Assurance Committee (WQAC)
  • WCA Regulations Committee (WRC) — Regulations sections
  • WCA Results Team (WRT) — no questions
  • WCA Software Team (WST) — Website section
  • WCA Advisory Council (WAC) — Public Relations section (?)

Timeline

I'm breaking this out from #5.

  • We should leave at least 48 hours for any step with WCA staff feedback (ideally 72 hours).
  • We should do things in parallel when possible (e.g. iterating on questions, testing the form on the survey site, testing the translation process)

Timeline

  • 2019-09-19 Presentation of the first draft of the survey to the Board & Leaders.
  • 2019-09-26 Launch staff/organizer survey (English only)
  • 2019-10-03 End of staff/organizer survey period (consider extending to 2019-10-06 based on response rate)
  • Week of 2019-10-11 Launch public survey (ideally with a Regulations draft ready)
  • Week of 2019-11-18 End of public survey period

Translation:

  • Initial technology test/translations during staff/organizer version
  • Finalize at end of staff/organizer version based on feedback

Technical:

  • 2019-09-22 Make version in SurveyMonkey
  • 2019-09-22 Generate tokens on website

Should we require WCA accounts for responses?

We would like as many responses as possible to be tied to WCA accounts, in order to be able to answer questions about specific populations (e.g. competitors in a certain region, with certain experience, or who hold certain WCA roles). However, this may be a high barrier to some.

For the time being, I would still like to plan to keep digital responses tied to WCA accounts, but we should check how isolated some of these communities are from WCA systems. It's easy to open up the survey to unverified respondents, but we wouldn't have the same confidence that they represent a distributed community view.

Identify organizers who are not listed by profile

Some competitions are listed as being organized by regional organizations instead of individuals. If possible, we should try to collect a list of the actual individuals, so we can give them the organizer's version of the survey.

Alternative: We can also make it easy to give someone the organizer's version if they request it (it's not secret), and then later determine who was really an organizer.

Questions from the WCT

To adapt:

  • Where do you usually get your cubing news? (if the question is better as yes or no, please rephrase as "Do you usually get your cubing news through WCA social media?")
  • What content is most interesting to you on WCA social media? (If the question is better as yes or no, please rephrase as "Do you find the WCA social media content to be informative and interesting?")
  • Does the WCA communicate well with the community about what is going on within the organization?
  • Did you know that there is a weekly announcement of new records/competitions on Reddit and Twitter? If not, would you like to see a link to it across all our social media?

General Guidelines for the 2019 Community Survey

Here are the guidelines that I believe should orient the design and conduct of the 2019 WCA Community Survey.

Here's a book I found that could be useful: "Designing and Conducting Survey Research: A Comprehensive Guide", by Louis M. Rea and Richard A. Parker. It thoroughly addresses the whole process of surveying, from designing questions to processing data.

Here's a website that succinctly tackles the same topic.

And here's the Google Doc where the writing and editing of the first draft of the survey will take place. Anyone with a @worldcubeassociation.org email can edit it (contact me if you can't). Anyone with a link to the document can make comments and suggestions. Please feel welcome to suggest any new question you might find relevant, as well as any improvements to existing ones.

General Guidelines

Here are the guidelines we already have from the first draft Lucas presented:

  • Each survey will be tied to a WCA website account.
    • We can do this using custom variables with SurveyMonkey's paid plan. This is $32 for one month (or $23 if we demonstrate non-profit status), which I (Lucas) am happy to contribute personally. (A sketch of how this linking could work follows this list.)
    • This means responses from competitors with WCA IDs can be associated with their competition results. This allows us to break down results by region, or by how recently someone competed.
  • Results will be available to any WCA staff member who asks the WCT.
    • This survey is designed so that we don't collect any sensitive information. We'll probably want to run bucketed or anonymized stats based on competitor ages, but this should be restricted to a very limited number of software team members.
  • Every answer except the first one is optional.
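
As a rough illustration of the custom-variable linking described above, here is a minimal sketch of how the website could mint per-account tokens and embed them in personalized survey links. The collector URL, the `wca_token` variable name, and the in-memory token store are placeholder assumptions, not the actual WCA website or SurveyMonkey setup:

```python
# Minimal sketch, assuming a SurveyMonkey weblink collector with a custom
# variable named "wca_token" (the collector URL and variable name here are
# placeholders, not the real ones).
import secrets
from urllib.parse import urlencode

SURVEY_URL = "https://www.surveymonkey.com/r/EXAMPLE"  # placeholder collector URL

# token -> WCA account id, kept server-side so the survey itself never sees
# the account id directly
token_to_account = {}

def personalized_link(wca_account_id):
    """Mint an unguessable token for this account and embed it in the link."""
    token = secrets.token_urlsafe(16)
    token_to_account[token] = wca_account_id
    return f"{SURVEY_URL}?{urlencode({'wca_token': token})}"

# Example: the link we would email to (hypothetical) account 12345
print(personalized_link(12345))
```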

Right from the get-go, the goals of conducting the survey should be clear. I believe these are the main goals of this process:

  • Measuring the community's opinion on the competition experience and how it can be improved;

  • Understanding the nature of the community's interaction with the WCA as a whole and gathering the data that allows us to improve its quality;

  • Asking how the community feels about a few particular issues.

With these goals in mind, a few things should be clear:

  • The survey is not a referendum, particularly when it comes to questions relating to the Regulations. The results we get are not binding and serve only to inform the WRC's and the WCA's activities in general.

  • The data gathered is not going to be publicly released and is for WCA Staff use only (although I think it's worth putting together a public report with some of the less sensitive data).

  • Designing and conducting the survey is only half the process. We'll need to carefully process the data gathered after surveying is done.

With these things in mind, we can start talking specifics.

Web-based Surveys

There are some pros and cons of conducting web-based surveys that I think we should keep in mind while conducting this process (taken directly from pages 12 and 13 of the book I linked):

Advantages:

  • Convenience: This technique represents a convenient and efficient way of reaching potential respondents. They are able to receive the questionnaire and complete it in the privacy of their home or office. This advantage is becoming particularly significant as the availability of computers becomes increasingly widespread.
  • Rapid data collection: Information, especially information that must be timely (e.g., a political public opinion poll related to an upcoming election), can be collected and processed within days.
  • Cost-effectiveness: This technique is more cost-effective than the traditional mail-out survey because there is no need for postage or paper supplies. It is also more cost-effective than the telephone and in-person surveys because it is not at all labor intensive.
  • Ample time: The respondent is not pressed for time in responding to the web-based survey and has the opportunity to consult records in answering the questions. There is time to consider response choices and respond to open-ended questions in the form of text.
  • Ease of follow-up: Potential respondents can be reminded to respond to the survey through follow-up e-mail messages.
  • Confidentiality and security: Personal or sensitive information supplied by the respondents can be protected on a secure server through the efforts of the research team.
  • Specialized populations: The survey is particularly useful in reaching specialized or well-identified populations whose e-mail addresses are readily available. For example, we have successfully used this technique to conduct surveys of satisfaction among employees and stakeholders of large public organizations.
  • Complexity and visual aids: As with mail-out surveys, web-based surveys can use visual images and more complex questions.

Disadvantages

  • Limited respondent bases: A major disadvantage of this technique is that it is limited to populations that have access to e-mail and a computer. Furthermore, the technique assumes a certain minimal level of computer literacy that is necessary for the completion and submission of the questionnaire. Such literacy is improving rapidly within the general population.
  • Self-selection: As in the traditional mail-out, there is a self-selection bias that leads to lower response rates. Those who do not use e-mail or are not comfortable with web-based technology exclude themselves from the sample. Also, individuals with reading or language issues tend not to respond to web-based surveys. Some researchers send the survey by e-mail in multiple languages in an effort to obviate this problem.
  • Lack of interviewer involvement: Since there is no interviewer involvement in the web-based survey, unclear questions cannot be explained, and respondents may not follow instructions. These problems can seriously compromise the scientific reliability of the survey even though telephone contacts are provided to the respondents in the event that they need help.

I believe that if we are mindful of these characteristics when designing, conducting and interpreting the survey, we can reach a better, more useful result.

Deeper discussion

Here are the four axes of work on the logistical side of the survey, each of which should be discussed in an issue of its own:

  • Guidelines for the Questions: standardizing the phrasing of questions so they are as clear and concise as possible (e.g. the balance between open-ended and closed-ended questions, the length of the survey, etc.). The book I linked dives deeply into this topic and should be very useful!

  • Community outreach strategy: what are we going to do to get as many people as possible to participate in the survey (e.g. translation; a public campaign in conjunction with the WCT; possibly notifying all registered speedcubers about the survey by email; contact with local leaders and regional organizations, etc.)

  • Technical aspects of the survey: issues relating to setting up the technical side of surveying (e.g. linking SurveyMonkey with WCA IDs; beta testing the survey; setting up translations on the SurveyMonkey website; etc.)

  • Survey Report: processing and analyzing the data gathered and compiling it into a document (the document should also contain a short introduction covering the original goal of the survey, the methods used for collecting the data, and the questions asked). A sketch of the kind of processing involved follows below.

If possible, I think having a WRC member lead each of these topics would help things run smoothly.
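
As one concrete example of the processing step in the Survey Report item above, here is a minimal sketch of the kind of bucketed, anonymized aggregation mentioned in the general guidelines. The bucket edges and the minimum-count threshold are illustrative assumptions, not agreed policy:

```python
# Minimal sketch of bucketed, anonymized age statistics; suppresses buckets
# that are too small to report safely.
from collections import Counter

AGE_BUCKETS = [(0, 12), (13, 17), (18, 25), (26, 39), (40, 120)]
MIN_COUNT = 5  # suppress buckets too small to be safely reported

def bucket_label(age):
    for lo, hi in AGE_BUCKETS:
        if lo <= age <= hi:
            return f"{lo}-{hi}"
    return "unknown"

def bucketed_age_counts(ages):
    counts = Counter(bucket_label(a) for a in ages)
    # Drop buckets with too few respondents rather than report exact small numbers.
    return {label: n for label, n in counts.items() if n >= MIN_COUNT}

# Example with made-up ages: only the 13-17 bucket is large enough to report.
print(bucketed_age_counts([14, 15, 16, 17, 17, 22, 23, 24, 25, 31, 45]))
```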

Schedule

Moved to #9

Paper Surveys

We are considering paper surveys that Delegates can hand out at competitions held during the survey period in regions that don't necessarily have common internet access. We can minimize the Delegate's work to:

  • Print the surveys (they should be a single 1-sided page, to make this easy)
  • Collect surveys
  • Send us a photo album with a photo of each survey (which we can transcribe)

We can ask the person collecting surveys to verify the competitor's WCA ID, if possible. If everyone has a name tag, it should be possible to get reasonable confidence that we're getting at most one response from each actual competitor. During analysis, we'd obviously be able to tell these apart from responses digitally tied to WCA accounts (and to spot when the same person submitted both online and on paper); a rough sketch of that merging step follows.
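
A minimal sketch of that merging step, assuming a simple record layout with a `wca_id` field and a `source` tag (both hypothetical, not an actual export or transcription format):

```python
# Minimal sketch of merging transcribed paper responses with digital ones
# during analysis, flagging paper responses whose WCA ID also appears online.

def merge_responses(digital, paper):
    """Combine both sources, keeping track of which paper responses duplicate
    an account-verified digital response for the same WCA ID."""
    seen_ids = {r["wca_id"] for r in digital if r.get("wca_id")}
    merged = [dict(r, source="digital") for r in digital]
    for r in paper:
        duplicate = bool(r.get("wca_id")) and r["wca_id"] in seen_ids
        merged.append(dict(r, source="paper", duplicate_of_digital=duplicate))
    return merged

# Example: 2016ABCD01 answered both online and on paper; the paper copy is flagged.
merged = merge_responses(
    digital=[{"wca_id": "2016ABCD01", "q1": "yes"}],
    paper=[{"wca_id": "2016ABCD01", "q1": "no"}, {"wca_id": None, "q1": "yes"}],
)
print(merged)
```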

Questions for Organizers

If the survey respondent has organized a competition recently, we can show them an extra page of questions. Possibilities:

  • Are you experiencing ongoing issues with new Speed Stacks equipment?
  • In 2020, the WRC plans to make scramble signatures mandatory. Do you foresee any issues with this given the resources normally available to you?
  • In 2020, the WRC plans to make 4b2++ (visually isolating the scramble table) mandatory. Do you foresee any issues with this given the venues and resources normally available to you?
  • 4b2a forbids giving access to scrambles/passcodes to anyone other than a Delegate listed for the competition, unless the Delegate is temporarily unavailable. Has the Delegate at one of your competitions ever had to give scrambles to someone else?
    • If yes, please list the reasons you have needed to do this.
  • Are there any possible changes to the Regulations that would make it significantly easier to run competitions? [free form response]
  • 6x6x6 and 7x7x7 are the only speedsolving events that are mean of 3 rather than average of 5. How big of a burden do you think it would be if we changed to average of 5 (to match the other events)?
    • TODO: describe the nuance of using lower cutoffs so that most competitors only do 2 attempts.

Questions about the Website

Possible questions about the website and its features:

  • Do you find the WCA website homepage useful?

    • If not, how could it become useful?
  • Would you like to see rankings based on gender on the WCA website?

  • What are new features you'd like to see on the WCA website?

    [free form response]

For now I couldn't think of any other relevant questions related to the website.

It makes sense to get in contact with the WST to see how they feel about each of these and to see if they'd like to ask any others.

Technical setup

Here are a couple of links that might be helpful for setting up the survey on Google Forms.

Google Forms has a pre-filled link feature that we could use.

How to pre-populate Google Forms using UTM parameters

Here's how that feature might work.

We could have links that prefill whether you are a delegate or organizer, a new or experienced competitor, etc.

This could, however, lead to malicious responses as one could manually say they are a delegate/organizer when they are not.
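
For illustration, here is a minimal sketch of how such pre-filled links could be built. The form URL and entry field IDs below are placeholders (a real form exposes its own numeric entry IDs through its "Get pre-filled link" option), and, as noted above, the prefilled values sit in the URL where a respondent can change them:

```python
# Minimal sketch of building pre-filled Google Forms links; the form URL and
# the entry field IDs are placeholders, not a real form's values.
from urllib.parse import urlencode

FORM_URL = "https://docs.google.com/forms/d/e/EXAMPLE_FORM_ID/viewform"
ROLE_ENTRY_ID = "entry.1111111111"        # placeholder field id for "role"
EXPERIENCE_ENTRY_ID = "entry.2222222222"  # placeholder field id for "experience"

def prefilled_link(role, experience):
    params = {"usp": "pp_url", ROLE_ENTRY_ID: role, EXPERIENCE_ENTRY_ID: experience}
    return f"{FORM_URL}?{urlencode(params)}"

# Example: a link we might send to Delegates. Anyone can edit the prefilled
# values before submitting, which is the spoofing risk noted above.
print(prefilled_link("Delegate", "Experienced"))
```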

As Lucas mentioned, it would cost about US$1,000 to use custom variables on a yearly plan in SurveyMonkey.

Translate Surveys

  • Get reliable translators
  • Try to be consistent in wording
  • Set up translation tool
