Hi everyone,
I need help with analysing these data. A description of the data: individuals are invited to a screening program every two years (or annually if they are high risk) until the age of 70. After the age of 70, individuals can still participate in the program, but only through self-referral, and it is free of charge. Demographic and other information is collected at each participation round. The data cover the period January 2011 to December 2017 only.
I am interested in knowing what factors are associated with continuing to participate in the program (self-referral, free of charge) after reaching the age limit of 70. The outcome variable is binary.
The data include 1) individuals who had not yet reached the age of 70 by the end of the study period, so we cannot observe whether they continue to use the program or not, and 2) individuals who were already above the age of 70 at their first recorded participation. Both groups need to be excluded before doing the analysis. Any suggestions on how to identify and exclude these participants?
I was thinking of using xtlogit, but I am not sure whether this is the right model.
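For concreteness, this is roughly what I had in mind (a sketch only; continue_after70 is a binary outcome I would still need to construct, and the covariates are from the example data below):

* declare the panel structure and fit a random-effects logit
xtset id
xtlogit continue_after70 i.smoke i.activity i.alcohol, re or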
Thanks in advance.
Example of data:

clear
input int id str9(dob participation_date) byte(age smoke activity alcohol)
2 "13-Nov-44" "5-Dec-12" 68 0 0 0
2 "13-Nov-44" "7-May-15" 70 0 0 0
2 "13-Nov-44" "9-Jun-17" 72 0 0 0
4 "14-Feb-44" "28-Nov-11" 67 0 1 0
4 "14-Feb-44" "22-Oct-12" 68 0 1 0
4 "14-Feb-44" "25-Oct-13" 69 0 1 0
4 "14-Feb-44" "10-Nov-14" 70 0 1 0
4 "14-Feb-44" "14-Dec-15" 71 0 1 0
4 "14-Feb-44" "12-Dec-16" 72 0 1 0
5 "31-Jul-38" "25-Jan-12" 73 0 0 0
5 "31-Jul-38" "8-Oct-13" 75 0 0 0
5 "31-Jul-38" "12-Feb-16" 77 0 0 0
12 "20-Jan-36" "16-May-12" 76 0 0 0
12 "20-Jan-36" "22-May-14" 78 0 0 0
14 "26-Dec-43" "17-Oct-12" 68 0 0 0
15 "13-May-36" "12-Mar-13" 76 0 0 0
15 "13-May-36" "29-Apr-15" 78 0 0 0
17 "16-Mar-40" "22-Jun-12" 72 0 0 0
17 "16-Mar-40" "4-Aug-14" 74 0 0 0
20 "1-Nov-41" "29-Jan-13" 71 0 0 0
20 "1-Nov-41" "4-Mar-15" 73 0 0 0
20 "1-Nov-41" "31-Mar-17" 75 0 0 0
21 "10-Oct-42" "28-Jan-15" 72 0 0 0
23 "8-Mar-40" "28-Feb-12" 71 0 0 0
23 "8-Mar-40" "3-Jun-14" 74 0 0 0
25 "1-Jun-39" "4-Feb-13" 73 0 0 0
25 "1-Jun-39" "17-Jul-15" 76 0 0 0
27 "3-Mar-34" "30-Jul-13" 79 0 0 0
27 "3-Mar-34" "26-May-15" 81 0 0 0
28 "29-Sep-43" "29-Oct-12" 69 0 0 0
28 "29-Sep-43" "12-Nov-14" 71 0 0 0
28 "29-Sep-43" "16-Dec-16" 73 0 0 0
33 "11-Jan-42" "24-Oct-11" 69 0 1 0
33 "11-Jan-42" "1-Dec-14" 72 0 1 0
33 "11-Jan-42" "12-Apr-16" 74 0 1 0
35 "16-Aug-36" "18-Dec-12" 76 0 0 0
40 "12-Aug-36" "5-Oct-11" 75 0 0 0
40 "12-Aug-36" "31-Oct-13" 77 0 0 0
40 "12-Aug-36" "13-Apr-16" 79 0 0 0
50 "7-Jul-28" "17-Apr-12" 83 1 1 0
50 "7-Jul-28" "7-May-13" 84 1 1 0
50 "7-Jul-28" "1-May-14" 85 1 1 0
50 "7-Jul-28" "23-Jun-15" 86 1 1 0
51 "25-Apr-41" "4-Apr-14" 72 0 0 0
51 "25-Apr-41" "2-May-16" 75 0 0 0
52 "5-Aug-43" "19-Aug-11" 68 0 0 0
52 "5-Aug-43" "1-Sep-15" 72 0 0 0
54 "13-Nov-40" "12-May-14" 73 0 0 0
55 "4-Mar-42" "9-Jul-14" 72 0 0 0
55 "4-Mar-42" "23-Sep-16" 74 0 0 0
56 "15-Feb-46" "5-Nov-12" 66 0 0 0
56 "15-Feb-46" "20-Oct-14" 68 0 0 0
56 "15-Feb-46" "21-Oct-16" 70 0 0 0
58 "18-Aug-46" "18-Apr-13" 66 0 0 0
58 "18-Aug-46" "28-Nov-14" 68 0 0 0
58 "18-Aug-46" "16-Dec-16" 70 0 0 0
64 "14-Jul-40" "8-Apr-14" 73 0 0 0
64 "14-Jul-40" "31-Mar-17" 76 0 0 0
65 "11-Feb-28" "22-Aug-14" 86 1 0 0
65 "11-Feb-28" "21-Aug-15" 87 1 0 0
65 "11-Feb-28" "17-Aug-16" 88 1 0 0
67 "24-Jan-22" "9-Aug-12" 90 0 0 0
68 "10-Jun-42" "23-Jan-12" 69 0 0 0
68 "10-Jun-42" "16-Jan-14" 71 0 0 0
68 "10-Jun-42" "12-Jan-16" 73 0 0 0
74 "14-Apr-34" "6-Aug-12" 78 0 0 0
76 "24-Sep-40" "14-Feb-12" 71 0 0 0
76 "24-Sep-40" "6-Dec-13" 73 0 0 0
76 "24-Sep-40" "26-Nov-15" 75 0 0 0
79 "16-Feb-33" "16-Mar-12" 79 0 0 0
79 "16-Feb-33" "23-Apr-14" 81 0 0 0
79 "16-Feb-33" "12-Apr-16" 83 0 0 0
80 "25-Jan-35" "2-Dec-16" 81 0 0 0
83 "30-Dec-37" "6-Jun-13" 75 0 0 0
85 "25-Nov-35" "18-Oct-12" 76 0 0 0
88 "3-Dec-35" "14-Mar-13" 77 0 0 0
92 "7-Oct-36" "8-Aug-13" 76 0 0 0
94 "11-Feb-43" "28-Apr-14" 71 0 0 0
94 "11-Feb-43" "20-Jul-16" 73 0 0 0
98 "16-Dec-40" "14-Jan-13" 72 0 0 0
98 "16-Dec-40" "28-Nov-14" 73 0 0 0
98 "16-Dec-40" "20-Feb-17" 76 0 0 0
99 "11-Jun-36" "27-Jan-12" 75 0 0 0
99 "11-Jun-36" "4-Mar-14" 77 0 0 0
99 "11-Jun-36" "3-Mar-16" 79 0 0 0
101 "20-Oct-34" "4-Apr-14" 79 0 0 0
103 "13-Feb-38" "8-Feb-12" 73 0 0 0
103 "13-Feb-38" "11-Mar-14" 76 0 0 0
103 "13-Feb-38" "3-Mar-17" 79 0 0 0
106 "11-Dec-37" "21-Nov-11" 73 0 0 0
106 "11-Dec-37" "19-Nov-13" 75 0 0 0
106 "11-Dec-37" "11-Sep-15" 77 0 0 0
109 "27-Aug-42" "18-Nov-11" 69 0 1 0
109 "27-Aug-42" "23-Apr-13" 70 0 1 0
109 "27-Aug-42" "29-May-14" 71 0 1 0
109 "27-Aug-42" "7-Jul-15" 72 0 1 0
109 "27-Aug-42" "11-Aug-16" 73 0 1 0
end
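With the example data above loaded, one possible way to identify and drop the two groups might be the following (a sketch only; the date handling and the strict > 70 cut-off are my assumptions):

* Convert the string dates to Stata dates (two-digit years: topyear 2017
* maps "44" to 1944 and "12" to 2012)
gen bdate = daily(dob, "DMY", 2017)
gen pdate = daily(participation_date, "DMY", 2017)
format bdate pdate %td

* Age in completed years at the end of the study window (31 Dec 2017)
gen age_end = floor((td(31dec2017) - bdate)/365.25)

* Age at each individual's first recorded participation
bysort id (pdate): gen first_age = age[1]

* 1) drop individuals who do not reach 70 by the end of the study period
drop if age_end < 70
* 2) drop individuals already above 70 at their first recorded participation
drop if first_age > 70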