I made an effort to call the CDC in order to participate in their survey. Although they weren't available at the time, I still hope to participate. Sometimes the most effective way to understand a concept is to experience it firsthand. One of our upcoming assignments is to join a youth program and analyze it as a researcher. The purpose behind that analysis is similar to my desire to participate in the CDC's survey research. Perhaps after my literature review has been handed in, I will be able to realize this desire.
I finally finished reading the Creswell book; the last two chapters cover the purpose statement and research questions and hypotheses. Both chapters provide wonderful basic insights into setting up these parts of a research paper, with actual scripts and useful wording dedicated to the three different research designs. This information will assist in creating a base for my own research project, something I can come back to when required.
According to Creswell, the purpose statement discusses "why you want to do the study, and what you intend to accomplish". Essentially, it's all about the intent, not the problem or issue that necessitates the research study. Additionally, the purpose statement is a vital piece of the puzzle that should be written separately from the introduction and other areas of a research paper. With regard to research questions and hypotheses, Creswell eloquently states that "from the broad, general purpose statement, the researcher narrows the focus to specific questions to be answered or predictions based on hypotheses to be tested". These are perfect guidelines for writing the next section of a research study paper.
Creswell mentions a key element of a successful paper: a concept he calls 'signposts' for the reader. Each section should be clearly defined or delineated to help the reader truly understand the steps of the process when reviewing the study. I also believe that creating these signposts will help guide the researcher through the study being conducted. I remember mentioning earlier in the semester that I liked quantitative research because I appreciated the clear, step-by-step process of an experiment, similar to conducting a science fair project. Ultimately, the research process, and the paper written on the study, adhere to a similar framework of step-by-step guidelines.
Saturday, October 22, 2011
Tuesday, October 18, 2011
Data Analysis & Writing Ideas
The main topic for this week was data analysis. As with almost everything in research so far, data is analyzed differently according to the type of research design. The link in the lecture to the University of the West of England's site presented a clear overview of quantitative data analysis, specifically in relation to statistics: "This is the process of presenting and interpreting numerical data. Descriptive statistics include measures of central tendency (averages - mean, median, and mode) and measures of variability about the average (range and standard deviation). These give the reader a 'picture' of the data collected and used in the research project. Inferential statistics are the outcomes of statistical tests, helping deductions to be made from the data collected, to test hypotheses set and relating findings to the sample or population." Numbers, numbers, numbers!

Qualitative data, by contrast, seems to be analyzed through a more inductive process: once the data has been organized, the researcher identifies patterns within it to gain insight into the research question. Miles and Huberman's process of data analysis was my favorite and I think will prove most helpful. Their three steps are actually ongoing throughout the research process:
1. Data Reduction - an analysis of initial data, almost a brainstorm on the research question using that data.
2. Data Display - the data becomes more organized in order to bring about conclusions.
3. Conclusion Drawing & Verification - lastly, the researcher decides what the data means and then verifies those conclusions.

Throughout the data analysis progression, I would want to have everything in tangible form in front of me as I organize. It's how I work now on research papers, and I believe it to be the best approach for a real research process, both qualitative and quantitative.
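The descriptive measures the UWE overview lists are easy to try out for oneself. Here is a minimal sketch in Python using the standard library's statistics module; the scores are invented purely for illustration, not from any real survey:

```python
import statistics

# Hypothetical scores, invented to illustrate the measures named above.
scores = [3, 4, 5, 6, 6, 7, 8]

mean = statistics.mean(scores)            # central tendency: mean
median = statistics.median(scores)        # central tendency: median (6)
mode = statistics.mode(scores)            # central tendency: mode (6)
value_range = max(scores) - min(scores)   # variability: range (5)
stdev = statistics.stdev(scores)          # variability: sample standard deviation

print(f"mean={mean:.2f} median={median} mode={mode} "
      f"range={value_range} stdev={stdev:.2f}")
```

These are the descriptive statistics that give the reader a 'picture' of the data; inferential statistics (t-tests, chi-square, and the like) would go a step further and test hypotheses against the sample.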
The YouTube video and Slideshare presentation offered a few great hints. The YouTube video mentioned a website called counselingtechnology.net, which can be useful for creating surveys. I found it to have a simple interface, and there is free membership for certain professional groups. A data analysis program for Excel called EZ Analyze was mentioned often in the video, and may come in handy at some point. The lecturer discussed three interesting hints for survey data analysis. First, code the paper surveys with identification numbers to make data entry and later analysis easier. Second, use a code that also contains information relevant to the survey, such as the date or area surveyed. Third, reserve every row within a data analysis program for a new person's data - not a new idea to an Excel user like me, but rather a confirmation of a sneaking suspicion. The Slideshare brought up the great question of "Why is sampling important?" This is something I ask myself when reading research that uses sampling: why is this sample important to this study, and was the sampling conducted successfully? According to the lecturer, sampling is used to test hypotheses that often become 'law-like', in the sense that the sample allows the researcher to infer certain facts about the wider population. This inference requires that the smaller sample represent the wider population correctly; in other words, internal validity must support external validity. Sampling is important in both qualitative and quantitative studies, and I thought the Slideshare really put this concept in perspective.
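The three survey-coding hints above can be sketched in a few lines of Python. Everything here is invented for illustration - the field names (date, area, q1, q2) and ID scheme are my own assumptions, not from the video - but it shows the idea of an ID that embeds the date and area, with one row per respondent, as one would lay it out in Excel for a tool like EZ Analyze:

```python
import csv
import io

# Hypothetical paper-survey responses (invented example data).
responses = [
    {"date": "20111018", "area": "LIB", "q1": 4, "q2": "yes"},
    {"date": "20111018", "area": "LIB", "q1": 2, "q2": "no"},
    {"date": "20111019", "area": "PARK", "q1": 5, "q2": "yes"},
]

# Hints 1 and 2: each survey gets an identification number that also
# encodes the date and area surveyed, plus a running count.
rows = []
for i, r in enumerate(responses, start=1):
    survey_id = f"{r['date']}-{r['area']}-{i:03d}"
    rows.append({"id": survey_id, **r})

# Hint 3: one row per respondent, written out as a CSV spreadsheet.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["id", "date", "area", "q1", "q2"])
writer.writeheader()
writer.writerows(rows)
print(buf.getvalue())
```

The first generated ID would be "20111018-LIB-001", which tells you at a glance when and where that paper survey was collected.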
This week I read two of Creswell's chapters on beginning to write a research paper. The first chapter, on writing strategies, covered some basics and fleshed them out with research design applications for a clearer understanding. What I found most applicable to me was the idea of writing an outline. I've always found this helpful in my writing, and I still use this strategy today. Creswell writes, "specify sections early in the design of a proposal. Work on one section will often prompt ideas for other sections. First develop an outline and then write something for each section rapidly, to get ideas down on paper. Then refine the sections as you consider in more detail the information that should go into each one." This method is very similar to Franklin's three-stage model: first develop an outline, then write a draft with more ideas surrounding the outline, and lastly edit and polish. These methods are closest to how I write now, but with clearer steps, and will therefore hopefully help me become a better writer in general. The one point in this chapter that I had never heard before, but which is fantastic advice, is the idea of a writing warm-up period as a starting exercise for both the mind and the fingers. Maybe this would be a great way to catch up on emails to friends.
The other Creswell chapter I read this week was about writing introductions. Ultimately, all research designs follow the similar pattern of "announc[ing] a problem and just[ifying] why it needs to be studied". When writing this introduction, Creswell's advice is to make sure the first sentence is a 'narrative hook'. This is very similar to something my husband has been working on in his songwriting. All songs have a hook, a part of the song that makes it memorable to the audience - so memorable, in fact, that it's the part one would sing when thinking of that song. Although a song's hook is usually not the first beats or the first stanza, it encapsulates a similar idea to the narrative hook of an introduction. The deficiencies model is a method of writing about two pages of introductory ideas. It contains five points that I found succinct and a great model to follow:
1. Write about the research problem.
2. Discuss studies that have addressed this problem.
3. Talk about the deficiencies in the aforementioned studies.
4. Discuss the significance of the study for particular audiences.
5. Write the purpose statement.
Wednesday, October 12, 2011
Research Q's, Survey advice, Holistic youth understanding
This week we delved into the topic of literature reviews, which pertains specifically to our upcoming assignment in about one month. There were several bits of relevant and important information I gleaned from the multimedia presentations. The Slideshare was particularly useful, explaining that first and foremost a literature review is not simply a review of every piece of literature on the topic at hand, but rather a compilation of the most vital ones. One technique I found very useful in defining a topic was the idea that a researcher should read the relevant information, search for inconsistencies in the articles, and then organize the material and information gleaned into something logical. Lastly, the Slideshare presented some fantastic methods by which a researcher can identify a great research question: First, the question should sustain the interest of the researcher. Second, the question should stay within the range of the researcher's competence. Third, the question should be manageable in size. Fourth, the question should have some basis in a theory.
The video had some great ideas for me to keep in mind as well. With regard to a good research question, it's important to note that it's more about the 'how' and 'why', and not so much about the 'what'. A good question will not have an obvious and quick answer; rather, it will demand data analysis. Lastly, a researcher should be careful that their question is not too broad a topic - in other words, it should be a one-part question. These pieces of advice will be helpful when reading research articles and when writing my literature review. They will also assist in my later research proposal.
The written lecture for this week contained some great survey advice to keep in mind, specifically in relation to youth. Firstly, interviewing children will produce better data than a written survey, where a child may need outside assistance, which skews the results. Surveys are better conducted in an informal or semi-structured setting, which puts children more at ease. Personally speaking, when discussing something important with my own kids, they seem to fidget a lot more when I sit them down and 'have a talk' instead of just casually telling them what's important. Lastly, in order to really get children to share information for a study, the researcher must first build up some trust with the child. This is similar to greeting new dogs that one meets out on the street. My mother always taught me to put out my hand first and let the dog sniff it in order to gain its trust. Obviously, children are a far cry from dogs, but a similar concept can be learned here.
A few days ago the CDC sent us a random survey, asking us to participate in their data collection of United States children's immunizations. Although I thought it was actually pretty cool to receive this survey from the CDC itself, I'm not sure I'm going to have the time to participate. There certainly was no incentive. However, I'd like to be a part of it, and perhaps analyze their method of asking phone interview questions.
J. Ellis wrote an amazing article that really pinpoints great methods of understanding a child holistically, which will eventually lead to a better study with those youth. Similar to the lecture, Ellis discusses the benefits of a pre-interview, essentially a method that builds a vital trust between the child and the researcher. One piece of advice she mentions that I thought was very clever was the idea that a seasoned interviewer goes into the interview with only a vague sense of what they would like to ask and accomplish, and improvises. Newer interviewers, however, should start by practicing with actual questions. This reminded me of the interview I conducted for my 204 class. I wrote down a list of specific questions, and although I was told to improvise somewhat, I found myself struggling at certain points. At those times, I would look down again at my paper to reposition my interview. Another point Ellis makes is that interview questions should be open-ended. This idea is not new to me, as I learned about it in my reference class when studying the reference interview process. The concept reared its head again in the archives class I'm taking currently, this time when a researcher comes in and the archivist must conduct a type of reference interview. I'm always fascinated when this type of overlap between classes occurs.
Wednesday, October 5, 2011
"Beginner's Mind"
This week I began to delve into the book related to researching youth, Representing Youth. This piece of literature looks like it will present a clearer understanding of youth and how best to approach research when dealing with that age group. One key point I'd like to remember from the book is the idea of a 'beginner's mind' - "which involves suspending preconceived notions and recognizing different conceptual locations when we enter into the research relationship, to meet participants on their own terms and to understand their locations". Brilliant. It is similar to a tabula rasa, with an added awareness of youth. The author included a variety of opinions and viewpoints in this first chapter, yet concluded with this 'beginner's mind' concept, which reminded me a little of the proposed setup for the upcoming integrated literature review assignment. I appreciated how both sides of the coin were discussed throughout the chapter, and one piece of advice I especially liked was the idea that certain perspectives on dealing with children should not be considered blueprints, but rather guidelines. Additionally, I noticed that the author specifically delineated teenagers as an integral youth sub-group worth consideration, but did not seem to focus on other youth sub-groups, such as toddlers or tweens. Lastly, the teen materials class I took last semester came to mind, as I recalled the 'teen brain', which in certain areas is not as completely developed as an adult's. Despite this potential handicap, I do believe that youth of all ages have different strengths that adults have lost over time. I hope that if I work with youth in research in the future, pointers such as these will assist me greatly in 'representing youth'.
The Creswell reading on mixed methods research design was somewhat of an overview of qualitative and quantitative methods, with added detail on the integration of these designs. I was surprised to learn that this is a relatively new approach, and therefore many of these integrations have yet to be implemented, or at the very least, published in journals. One of my group members recently abstracted a "qualitative/quantitative" study. I thought perhaps this should be referred to as a mixed methods study, but she told me that this was how the authors dubbed the research. Perhaps this is because of the newness of the mixed methods design.
The literature review assignment is due in about a month. I've already saved many articles I've found on my subject of library summer reading programs. Based on this week's video lecture, it seems that this is essentially a research paper where a query is analyzed and discussed through peer-reviewed journal articles, or other scholarly literature works. My 200 class really prepared me well for finding these scholarly articles as it was part of the curriculum, and as this class progresses I'm finding myself more and more thankful for that early instruction. Also, I'm interested in discovering what "mind-mapping" is all about through the suggested site: http://www.mindmeister.com/
General Notes:
I just discovered that Joanne's dissertation topic was the same that I have chosen to abstract. This fills me with some trepidation as I hope to live up to high standards in a world with a plethora of information on any one topic.
Since the literature review due date is coming up, I must remember to put some books on loan that I've mined from some of my journal articles.
Other mined pieces that have older dates seem to be non-existent in the King Library database system. I'm hoping to either find them through Google Scholar or through my own LAPL databases. However, I've discovered that finding the correct database on a less academic system like LAPL, is not as simple as I initially thought.
Sunday, October 2, 2011
Mnemonic for Qualitative vs Quantitative
Qualitative research design and quantitative research design are very different from each other. The key is to remember which one is which, especially because the names are so similar. The most efficient way that I've found to differentiate between the two is through a simple mnemonic method. "Quantity" refers to a number, as a reminder for the quantitative research design since this method includes a lot of counting and statistics. Once I remember that, it becomes easier for me to apply this method to many of the research designs I've been exposed to as a younger student, which were mostly quantitative in nature.
Recently, I recalled being part of a focus group, which is a type of qualitative research method. My family and I were spending the day at Disneyland when we were approached by a Disney employee and asked if one of the adults in our group - in other words, either my husband or I - would like to participate in a focus group that afternoon. The incentive was $100 cash on the spot, which essentially paid for most of our visit that day! I assented, and attended an hour-long casual question-and-answer forum. There were about twenty participants in my group, and a Disney employee leading the group asked questions like 'what do you think could be improved about the food offered in the parks'. The focus group incentive reminded me of an idea the IRB training mentioned: the incentive should be just enough to interest the participant, yet not so much that it compromises the participant's honest answers. Given the cost of one ticket to Disneyland, the incentive was just enough to entice me to leave my family for an hour in order to attend and give my honest opinion.
As I was reading the chapter in the Creswell book about quantitative methods, I found myself remembering my days as a student at Santa Monica College, taking a mandatory class in statistics. Although it was a basic class and I passed without much of a problem, I found the last few chapters of the text extremely complicated. However, in order to receive a Psychology degree from California State University, Northridge, statistics was a requirement. Once I was in full psych mode at CSUN, I discovered that statistics truly plays a large role in quantitative research design. Although I like the definite proof that numbers provide, even granting that there is always some level of ambiguity with +/- results, statistical analysis is certainly not my strongest point. In fact, the weekly lecture notes that most scientists are not statisticians, since statistics is ultimately a specialized field. I wonder how many quantitative researchers are able to close this knowledge gap.
Lastly, an interesting point I gleaned from this week's lecture notes was that survey research can refer to both quantitative and qualitative research designs, although in most cases it's considered a quantitative method. If a survey is viewed or utilized as a substitute for an interview, which is qualitative in nature, this makes sense.