thewca / community-surveys
Home Page: https://thewca.github.io/community-surveys/2019/draft/
Initial questions:
I'm breaking this out from #5.
Translation:
Technical:
We would like as many responses as possible to be tied to WCA accounts, in order to be able to answer questions about specific populations (e.g. competitors in a certain region, with certain experience, or who hold certain WCA roles). However, this may be a high barrier to some.
For the time being, I would still like to plan to keep digital responses tied to WCA accounts, but we should check how isolated some of these communities are from WCA systems. It's easy to open up the survey to unverified respondents, but we wouldn't have the same confidence that they represent a distributed community view.
Some competitions are listed as being organized by regional organizations instead of individuals. If possible, we should try to collect a list of the actual individuals, so we can give them the organizer's version of the survey.
Alternative: We can also make it easy to give someone the organizer's version if they request it (it's not secret), and then later determine who was really an organizer.
To adapt:
- Where do you usually get your cubing news? (if the question is better as yes or no, please rephrase as "Do you usually get your cubing news through WCA social media?")
- What content is most interesting to you on WCA social media? (If the question is better as yes or no, please rephrase as "Do you find the WCA social media content to be informative and interesting?")
- Does the WCA communicate well with the community about what is going on within the organization?
- Did you know that there is a weekly announcement of new records/competitions on Reddit and Twitter? If not, would you like to see a link across all our social media?
Here are the guidelines that I believe should orient the design and conduct of the 2019 WCA Community Survey.
Here's a book I found that could be useful: "Designing and Conducting Survey Research: A Comprehensive Guide", by Louis M. Rea and Richard A. Parker. It thoroughly addresses the whole process of surveying, from designing questions to processing data.
Here's a website that succinctly tackles the same topic.
And here's the Google Doc where the writing and editing of the first draft of the survey will take place. Anyone with a @worldcubeassociation.org email can edit it (contact me if you can't). Anyone with a link to the document can make comments and suggestions. Please feel welcome to suggest any new question you might find relevant, as well as any improvements to existing ones.
Here are the guidelines we already have on the first draft Lucas presented:
Each survey will be tied to a WCA website account.
< We can do this using custom variables with SurveyMonkey using their paid plan. This is $32 for one month (or $23 if we demonstrate non-profit status), which I (Lucas) am happy to contribute personally.
< This means responses from competitors with WCA IDs can be associated with their competition results. This allows us to break down results by region, or by how recently someone competed.
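As a sketch of how the account linkage could work: SurveyMonkey's custom variables are passed as query-string parameters on the collector URL, so we could generate a per-respondent link carrying the WCA ID. The collector URL, the variable name `wca_id`, and the example ID below are all assumptions for illustration; the real values would come from the survey's settings once it's created.

```python
from urllib.parse import urlencode

# Hypothetical collector URL; the real one comes from the survey's
# "Web Link" collector in SurveyMonkey.
COLLECTOR_URL = "https://www.surveymonkey.com/r/XXXXXXX"


def survey_link(wca_id):
    """Build a per-respondent link carrying a custom variable.

    'wca_id' would have to be declared as a custom variable in the
    survey settings (a paid SurveyMonkey feature, as noted above).
    """
    return COLLECTOR_URL + "?" + urlencode({"wca_id": wca_id})
```

For example, `survey_link("2019ABCD01")` yields the collector URL with `?wca_id=2019ABCD01` appended, and that value would then appear alongside the response in the export.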
Results will be available to any WCA staff member who asks the WCT.
< This survey is designed so that we don't collect any sensitive information. We'll probably want to run bucketed or anonymized stats based on competitor ages, but this should be restricted to a very limited number of software team members.
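For the bucketed age stats mentioned above, one simple approach is to map exact ages to coarse ranges before any report is generated, so individual ages never leave the restricted dataset. The bucket edges below are an assumption, not something that has been decided.

```python
def age_bucket(age):
    """Map an exact age to a coarse bucket for anonymized reporting.

    Bucket boundaries are illustrative only; the actual ranges would
    need to be agreed on before processing.
    """
    if age < 13:
        return "under 13"
    if age < 18:
        return "13-17"
    if age < 25:
        return "18-24"
    if age < 35:
        return "25-34"
    return "35+"
```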
Every answer except the first one is optional.
Right from the get-go, the goals of conducting the survey should be clear. I believe these are the main goals of this process:
Measuring the community's opinion on the competition experience and how it can be improved;
Understanding the nature of the community's interaction with the WCA as a whole, and gathering data that allows us to improve the quality of that interaction;
Asking how the community feels about a few particular issues.
With these goals in mind, a few things should be clear:
The survey is not a referendum, particularly when it comes to questions relating to the Regulations. The results we get are not binding and serve only to inform the WRC's and the WCA's activities in general.
The data gathered is not going to be publicly released and is for WCA Staff use only (although I think it's worth putting together a public report with some of the less sensitive data).
Designing and conducting the survey is only half the process. We'll need to carefully process the data gathered after surveying is done.
With these things in mind, we can start talking specifics.
There are some pros and cons of conducting web-based surveys that I think we should keep in mind while conducting this process (taken directly from pages 12 and 13 of the book I linked):
Advantages:
Disadvantages:
I believe that if we are mindful of these characteristics when designing, conducting and interpreting the survey, we can reach a better, more useful result.
Here are the four axes of work on the logistical side of the survey that should be discussed on an issue of their own:
Guidelines for the questions: standardizing the phrasing of questions so they are as clear and concise as possible (e.g. the balance between open-ended and closed-ended questions, the length of the survey, etc.). The book I linked dives deeply into this topic and should be very useful!
Community outreach strategy: what are we going to do to get as many people to participate in the survey as possible (e.g. translation; a public campaign in conjunction with the WCT; possible email notification of all registered speedcubers about the survey; contact with local leaders and regional organizations, etc.)
Technical aspects of the survey: issues relating to setting up the technical side of surveying (e.g. linking SurveyMonkey with WCA IDs; beta testing the survey; setting up translations on the SurveyMonkey website; etc.)
Survey report: processing and analyzing the data gathered and compiling it into a document (the document should also contain a short introduction to the original goal of the survey, the methods used for collecting the data, and the questions asked).
If we could, I think having a WRC member lead each of these topics would help things run smoothly.
Moved to #9
We are considering having paper surveys that the Delegate can hand out at competitions held during the survey period in regions without reliable internet access. We can minimize the Delegate's work as follows:
We can ask the person collecting surveys to verify the competitor's WCA ID, if possible. If everyone has a name tag, it should be possible to get reasonable confidence that we're getting at most one response from each actual competitor. During analysis, we'd obviously be able to tell these apart from responses digitally tied to WCA accounts (and detect if we have a submission on behalf of the same person both online and on paper).
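When merging the two sets of responses, the online/paper duplicate case above can be handled by preferring the digitally verified copy whenever the same WCA ID appears in both. A minimal sketch, assuming each response is a dict with a `wca_id` key (the real export shape depends on the survey tool):

```python
def merge_responses(digital, paper):
    """Merge digital and paper responses, preferring the digital copy
    when the same WCA ID appears in both sets.

    Responses without a WCA ID are kept as-is, since they can't be
    deduplicated reliably.
    """
    verified_ids = {r["wca_id"] for r in digital if r.get("wca_id")}
    merged = list(digital)
    for response in paper:
        wca_id = response.get("wca_id")
        if wca_id and wca_id in verified_ids:
            continue  # duplicate of a verified digital response
        merged.append(response)
    return merged
```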
If the survey respondent has organized a competition recently, we can show them an extra page of questions. Possibilities:
Possible questions about the website and its features:
Do you find the WCA website homepage useful?
Would you like to see rankings based on gender on the WCA website?
What are new features you'd like to see on the WCA website?
[free form response]
For now I couldn't think of any other relevant questions related to the website.
It makes sense to get in contact with the WST to see how they feel about each of these and to see if they'd like to ask any others.
Here are a couple of links that might be helpful for setting up the survey on Google Forms.
It has a pre-filled link feature that we could use.
How to pre-populate Google Forms using UTM parameters
Here's how that feature might work.
We could have links that prefill whether you are a delegate or organizer, a new or experienced competitor, etc.
This could, however, lead to malicious responses as one could manually say they are a delegate/organizer when they are not.
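As a sketch of that pre-filled link idea: Google Forms encodes pre-filled answers as `entry.<id>` query parameters on the `viewform` URL, where the entry IDs come from the form editor's "Get pre-filled link" feature. The form URL and entry ID below are placeholders, not real values.

```python
from urllib.parse import urlencode

# Hypothetical form URL and entry ID; real values come from the
# "Get pre-filled link" feature in the Google Forms editor.
FORM_URL = "https://docs.google.com/forms/d/e/FORM_ID/viewform"
ROLE_ENTRY = "entry.1000001"


def prefilled_link(role):
    """Build a link that pre-fills the respondent's role question.

    Note: anyone can edit the query string by hand, so (as pointed
    out above) pre-filled values are hints, not authentication.
    """
    return FORM_URL + "?" + urlencode({ROLE_ENTRY: role})
```

For example, `prefilled_link("delegate")` produces a link that opens the form with the role question already answered; we'd generate one such link per audience (delegates, organizers, new competitors, etc.).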
As Lucas mentioned, it would cost about US$1,000 to use custom variables on a yearly plan in SurveyMonkey.
[Moved into the repo]