Friday, April 30, 2021

Individual line colours in pcspike

Hello,

I'm trying to create a parallel axis dot plot where each line has a different colour (I've managed to get everything else I want). The one I have built so far has all the lines in one colour, navy (I must admit I usually don't favour multiple colours as they can become confusing, but in this case there aren't too many observations). I've attached an example of my dataset, the code I've been using to generate the plot, and a .png of what it looks like so far.

I'm using Stata version 16.1 on macOS Big Sur version 11.2.3.

This is an example of my dataset:

Code:
* Example generated by -dataex-. For more info, type help dataex
clear
input double F byte CaN double G byte CaoN
 3.4 0 10.4 1
 5.8 0   18 1
 5.8 0 15.1 1
 6.4 0 20.4 1
   8 0 13.3 1
12.5 0   17 1
13.5 0 18.3 1
 7.1 0 18.3 1
17.8 0   22 1
 2.8 0 20.9 1
   6 0 12.9 1
13.3 0 27.1 1
 9.6 0 23.2 1
 4.3 0 11.8 1
 9.5 0 16.8 1
   2 0  8.4 1
 4.7 0 12.8 1
 6.4 0 15.9 1
 2.9 0   24 1
   3 0 13.4 1
 7.1 0 31.5 1
 6.2 0 24.7 1
 2.1 0   26 1
 4.6 0 19.7 1
 2.2 0 41.4 1
 1.9 0 14.5 1
11.2 0 18.3 1
13.3 0 20.4 1
 4.3 0 13.6 1
12.7 0   16 1
end
The code I have used to generate my plot is below:

Code:
twoway pcspike F CaN G CaoN, title(Parallel axis dot plot) ///
    || scatter F CaN, msym(O) pstyle(p2) ///
    || scatter G CaoN, msym(O) pstyle(p4) ///
    ytitle("EGM Amplitude (mV)") xla("" "", notick) ///
    legend(order(2 "Cathodal-anodal capture" 3 "Cathodal or anodal capture"))
Thank you very much for your help,

Don
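Not an official answer, but one workaround on Stata 16 (which has no colour-by-variable option for pcspike) is to build one pcspike layer per observation, cycling through the built-in plot styles p1–p15. A minimal sketch, assuming the variables from the post are in memory; with 30 observations the two scatters become layers 31 and 32, so the legend order() numbers from the original command would need adjusting (legend(off) is used here to sidestep that):

```stata
* one spike layer per observation, colours cycled via pstyle(p1)-pstyle(p15)
local layers
forvalues i = 1/`=_N' {
    local p = mod(`i' - 1, 15) + 1
    local layers `layers' (pcspike F CaN G CaoN in `i', pstyle(p`p'))
}
twoway `layers' ///
    (scatter F CaN, msym(O) pstyle(p2)) ///
    (scatter G CaoN, msym(O) pstyle(p4)), ///
    title(Parallel axis dot plot) ytitle("EGM Amplitude (mV)") legend(off)
```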

Line Graph in Panel Dataset

Hi! I wanted to know how to create a single line graph for a panel dataset. The dataset looks something like this:

Country Year GDP
A       2000 ...
A       2001 ...
A       2002 ...
B       2000 ...
B       2001 ...
B       2002 ...
C       2000 ...
C       2001 ...
C       2002 ...
I want a single line that combines all countries into one, rather than separate lines per country, with GDP on the y-axis and Year on the x-axis.
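One reading of "all countries into one" is the average GDP across countries in each year; a sketch under that assumption, using the variable names Country, Year, GDP from the post:

```stata
* collapse to one observation per year (mean GDP across countries), then plot
preserve
collapse (mean) GDP, by(Year)
twoway line GDP Year, ytitle("Mean GDP") xtitle("Year")
restore
```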

Creating a treatment for DID analysis: how to do it the right way?

Hi all. I have panel data containing the number of Brazilian institutions of higher education (variable "n_he_inst") across more than 5500 municipalities for 5 years (2002, 2006, 2010, 2014 and 2018). I would like to use a difference-in-differences design to analyze the impact of having a higher ed institution in a municipality on five dependent variables.

1) I'm not sure how to create the treatment variable. Do I have to code "1" for municipalities/years with one or more universities and code "zero" for municipalities/years without universities?

2) What else do I need in my dataset to conduct the DID analysis?

Part of the data follows below.

Variables:

year
ibge_mun_code : municipality's ID
NOME_MUNICIPIO: municipality's name
n_he_inst : number of higher education institutions in a municipality

Thank you


Code:
* Example generated by -dataex-. For more info, type help dataex
clear
input double year long ibge_mun_code str33 NOME_MUNICIPIO float n_he_inst
2002 1507953 "TAILANDIA"                  0
2006 1507953 "TAILÂNDIA"                  0
2010 1507953 "TAILÂNDIA"                  0
2014 1507953 "TAILÂNDIA"                  0
2018 1507953 "TAILÂNDIA"                  1
2002 1507961 "TERRA ALTA"                 0
2006 1507961 "TERRA ALTA"                 0
2010 1507961 "TERRA ALTA"                 0
2014 1507961 "TERRA ALTA"                 0
2018 1507961 "TERRA ALTA"                 0
2002 1507979 "TERRA SANTA"                0
2006 1507979 "TERRA SANTA"                0
2010 1507979 "TERRA SANTA"                0
2014 1507979 "TERRA SANTA"                0
2018 1507979 "TERRA SANTA"                0
2002 1508001 "TOME ACU"                   0
2006 1508001 "TOMÉ-AÇU"                   0
2010 1508001 "TOMÉ-AÇU"                   0
2014 1508001 "TOMÉ-AÇU"                   0
2018 1508001 "TOMÉ-AÇU"                   0
2002 1508035 "TRACUATEUA"                 0
2006 1508035 "TRACUATEUA"                 0
2010 1508035 "TRACUATEUA"                 0
2014 1508035 "TRACUATEUA"                 0
2018 1508035 "TRACUATEUA"                 0
2002 1508050 "TRAIRAO"                    0
2006 1508050 "TRAIRÃO"                    0
2010 1508050 "TRAIRÃO"                    0
2014 1508050 "TRAIRÃO"                    0
2018 1508050 "TRAIRÃO"                    0
2002 1508084 "TUCUMA"                     0
2006 1508084 "TUCUMÃ"                     0
2010 1508084 "TUCUMÃ"                     0
2014 1508084 "TUCUMÃ"                     0
2018 1508084 "TUCUMÃ"                     0
2002 1508100 "TUCURUI"                    1
2006 1508100 "TUCURUÍ"                    1
2010 1508100 "TUCURUÍ"                    1
2014 1508100 "TUCURUÍ"                    1
2018 1508100 "TUCURUÍ"                    2
2002 1508126 "ULIANOPOLIS"                0
2006 1508126 "ULIANÓPOLIS"                0
2010 1508126 "ULIANÓPOLIS"                0
2014 1508126 "ULIANÓPOLIS"                0
2018 1508126 "ULIANÓPOLIS"                0
2002 1508159 "URUARA"                     0
2006 1508159 "URUARÁ"                     0
2010 1508159 "URUARÁ"                     0
2014 1508159 "URUARÁ"                     0
2018 1508159 "URUARÁ"                     0
2002 1508209 "VIGIA"                      0
2006 1508209 "VIGIA"                      0
2010 1508209 "VIGIA"                      0
2014 1508209 "VIGIA"                      0
2018 1508209 "VIGIA"                      0
2002 1508308 "VIZEU"                      0
2006 1508308 "VISEU"                      0
2010 1508308 "VISEU"                      0
2014 1508308 "VISEU"                      0
2018 1508308 "VISEU"                      0
2002 1508357 "VITORIA DO XINGU"           0
2006 1508357 "VITÓRIA DO XINGU"           0
2010 1508357 "VITÓRIA DO XINGU"           0
2014 1508357 "VITÓRIA DO XINGU"           0
2018 1508357 "VITÓRIA DO XINGU"           0
2002 1508407 "XINGUARA"                   0
2006 1508407 "XINGUARA"                   0
2010 1508407 "XINGUARA"                   0
2014 1508407 "XINGUARA"                   0
2018 1508407 "XINGUARA"                   0
2002 1600055 "SERRA DO NAVIO"             0
2006 1600055 "SERRA DO NAVIO"             0
2010 1600055 "SERRA DO NAVIO"             0
2014 1600055 "SERRA DO NAVIO"             0
2018 1600055 "SERRA DO NAVIO"             0
2002 1600105 "AMAPA"                      0
2006 1600105 "AMAPÁ"                      0
2010 1600105 "AMAPÁ"                      0
2014 1600105 "AMAPÁ"                      0
2018 1600105 "AMAPÁ"                      0
2002 1600154 "AMAPARI"                    0
2006 1600154 "AMAPARI"                    0
2010 1600154 "ÁGUA BRANCA DO AMAPARI"     0
2014 1600154 "PEDRA BRANCA DO AMAPARI"    0
2018 1600154 "PEDRA BRANCA DO AMAPARI"    0
2002 1600204 "CALCOENE"                   0
2006 1600204 "CALÇOENE"                   0
2010 1600204 "CALÇOENE"                   0
2014 1600204 "CALÇOENE"                   0
2018 1600204 "CALÇOENE"                   0
2002 1600212 "CUTIAS"                     0
2006 1600212 "CUTIAS"                     0
2010 1600212 "CUTIAS"                     0
2014 1600212 "CUTIAS"                     0
2018 1600212 "CUTIAS"                     0
2002 1600238 "FERREIRA GOMES"             0
2006 1600238 "FERREIRA GOMES"             0
2010 1600238 "FERREIRA GOMES"             0
2014 1600238 "FERREIRA GOMES"             0
2018 1600238 "FERREIRA GOMES"             0
2002 1600253 "ITAUBAL"                    0
2006 1600253 "ITAUBAL"                    0
2010 1600253 "ITAUBAL"                    0
2014 1600253 "ITAUBAL"                    0
2018 1600253 "ITAUBAL"                    0
2002 1600279 "LARANJAL DO JARI"           0
2006 1600279 "LARANJAL DO JARÍ"           0
2010 1600279 "LARANJAL DO JARI"           0
2014 1600279 "LARANJAL DO JARI"           0
2018 1600279 "LARANJAL DO JARI"           0
2002 1600303 "MACAPA"                     6
2006 1600303 "MACAPÁ"                    11
2010 1600303 "MACAPÁ"                    14
2014 1600303 "MACAPÁ"                    15
2018 1600303 "MACAPÁ"                    14
2002 1600402 "MAZAGAO"                    0
2006 1600402 "MAZAGÃO"                    0
2010 1600402 "MAZAGÃO"                    0
2014 1600402 "MAZAGÃO"                    0
2018 1600402 "MAZAGÃO"                    0
2002 1600501 "OIAPOQUE"                   0
2006 1600501 "OIAPOQUE"                   0
2010 1600501 "OIAPOQUE"                   0
2014 1600501 "OIAPOQUE"                   0
2018 1600501 "OIAPOQUE"                   0
2002 1600535 "PORTO GRANDE"               0
2006 1600535 "PORTO GRANDE"               0
2010 1600535 "PORTO GRANDE"               0
2014 1600535 "PORTO GRANDE"               0
2018 1600535 "PORTO GRANDE"               0
2002 1600550 "PRACUUBA"                   0
2006 1600550 "PRACUÚBA"                   0
2010 1600550 "PRACUÚBA"                   0
2014 1600550 "PRACUÚBA"                   0
2018 1600550 "PRACUÚBA"                   0
2002 1600600 "SANTANA"                    0
2006 1600600 "SANTANA"                    1
2010 1600600 "SANTANA"                    1
2014 1600600 "SANTANA"                    1
2018 1600600 "SANTANA"                    1
2002 1600709 "TARTARUGALZINHO"            0
2006 1600709 "TARTARUGALZINHO"            0
2010 1600709 "TARTARUGALZINHO"            0
2014 1600709 "TARTARUGALZINHO"            0
2018 1600709 "TARTARUGALZINHO"            0
2002 1600808 "VITORIA DO JARI"            0
2006 1600808 "VITÓRIA DO JARÍ"            0
2010 1600808 "VITÓRIA DO JARI"            0
2014 1600808 "VITÓRIA DO JARI"            0
2018 1600808 "VITÓRIA DO JARI"            0
2002 1700251 "ABREULANDIA"                0
2006 1700251 "ABREULÂNDIA"                0
2010 1700251 "ABREULÂNDIA"                0
2014 1700251 "ABREULÂNDIA"                0
2018 1700251 "ABREULÂNDIA"                0
2002 1700301 "AGUIARNOPOLIS"              0
2006 1700301 "AGUIARNÓPOLIS"              0
2010 1700301 "AGUIARNÓPOLIS"              0
2014 1700301 "AGUIARNÓPOLIS"              0
2018 1700301 "AGUIARNÓPOLIS"              0
2002 1700350 "ALIANCA DO TOCANTINS"       0
2006 1700350 "ALIANÇA DO TOCANTINS"       0
2010 1700350 "ALIANÇA DO TOCANTINS"       0
2014 1700350 "ALIANÇA DO TOCANTINS"       0
2018 1700350 "ALIANÇA DO TOCANTINS"       0
2002 1700400 "ALMAS"                      0
2006 1700400 "ALMAS"                      0
2010 1700400 "ALMAS"                      0
2014 1700400 "ALMAS"                      0
2018 1700400 "ALMAS"                      0
2002 1700707 "ALVORADA"                   0
2006 1700707 "ALVORADA"                   0
2010 1700707 "ALVORADA"                   0
2014 1700707 "ALVORADA"                   0
2018 1700707 "ALVORADA"                   0
2002 1701002 "ANANAS"                     0
2006 1701002 "ANANÁS"                     0
2010 1701002 "ANANÁS"                     0
2014 1701002 "ANANÁS"                     0
2018 1701002 "ANANÁS"                     0
2002 1701051 "ANGICO"                     0
2006 1701051 "ANGICO"                     0
2010 1701051 "ANGICO"                     0
2014 1701051 "ANGICO"                     0
2018 1701051 "ANGICO"                     0
2002 1701101 "APARECIDA DO RIO NEGRO"     0
2006 1701101 "APARECIDA DO RIO NEGRO"     0
2010 1701101 "APARECIDA DO RIO NEGRO"     0
2014 1701101 "APARECIDA DO RIO NEGRO"     0
2018 1701101 "APARECIDA DO RIO NEGRO"     0
2002 1701309 "ARAGOMINAS"                 0
2006 1701309 "ARAGOMINAS"                 0
2010 1701309 "ARAGOMINAS"                 0
2014 1701309 "ARAGOMINAS"                 0
2018 1701309 "ARAGOMINAS"                 0
2002 1701903 "ARAGUACEMA"                 0
2006 1701903 "ARAGUACEMA"                 0
2010 1701903 "ARAGUACEMA"                 0
2014 1701903 "ARAGUACEMA"                 0
2018 1701903 "ARAGUACEMA"                 0
2002 1702000 "ARAGUACU"                   0
2006 1702000 "ARAGUAÇU"                   0
2010 1702000 "ARAGUAÇU"                   0
2014 1702000 "ARAGUAÇU"                   0
2018 1702000 "ARAGUAÇU"                   0
2002 1702109 "ARAGUAINA"                  8
2006 1702109 "ARAGUAÍNA"                 12
2010 1702109 "ARAGUAÍNA"                 12
2014 1702109 "ARAGUAÍNA"                  3
2018 1702109 "ARAGUAÍNA"                  3
2002 1702158 "ARAGUANA"                   0
2006 1702158 "ARAGUANÃ"                   0
2010 1702158 "ARAGUANÃ"                   0
2014 1702158 "ARAGUANÃ"                   0
2018 1702158 "ARAGUANÃ"                   0
2002 1702208 "ARAGUATINS"                 0
2006 1702208 "ARAGUATINS"                 1
2010 1702208 "ARAGUATINS"                 1
2014 1702208 "ARAGUATINS"                 1
2018 1702208 "ARAGUATINS"                 1
2002 1702307 "ARAPOEMA"                   0
2006 1702307 "ARAPOEMA"                   0
2010 1702307 "ARAPOEMA"                   0
2014 1702307 "ARAPOEMA"                   0
2018 1702307 "ARAPOEMA"                   0
2002 1702406 "ARRAIAS"                    0
2006 1702406 "ARRAIAS"                    0
2010 1702406 "ARRAIAS"                    0
2014 1702406 "ARRAIAS"                    0
2018 1702406 "ARRAIAS"                    0
2002 1702554 "AUGUSTINOPOLIS"             0
2006 1702554 "AUGUSTINÓPOLIS"             0
2010 1702554 "AUGUSTINÓPOLIS"             0
2014 1702554 "AUGUSTINÓPOLIS"             1
2018 1702554 "AUGUSTINÓPOLIS"             1
2002 1702703 "AURORA DO TOCANTINS"        0
2006 1702703 "AURORA DO TOCANTINS"        0
2010 1702703 "AURORA DO TOCANTINS"        0
2014 1702703 "AURORA DO TOCANTINS"        0
2018 1702703 "AURORA DO TOCANTINS"        0
2002 1702901 "AXIXA DO TOCANTINS"         0
2006 1702901 "AXIXÁ DO TOCANTINS"         0
2010 1702901 "AXIXÁ DO TOCANTINS"         0
2014 1702901 "AXIXÁ DO TOCANTINS"         0
2018 1702901 "AXIXÁ DO TOCANTINS"         0
2002 1703008 "BABACULANDIA"               0
2006 1703008 "BABAÇULÂNDIA"               0
2010 1703008 "BABAÇULÂNDIA"               0
2014 1703008 "BABAÇULÂNDIA"               0
2018 1703008 "BABAÇULÂNDIA"               0
2002 1703057 "BANDEIRANTE DO TOCANTINS"   0
2006 1703057 "BANDEIRANTE DO TOCANTINS"   0
2010 1703057 "BANDEIRANTES DO TOCANTINS"  0
2014 1703057 "BANDEIRANTES DO TOCANTINS"  0
2018 1703057 "BANDEIRANTES DO TOCANTINS"  0
2002 1703073 "BARRA DO OURO"              0
2006 1703073 "BARRA DO OURO"              0
2010 1703073 "BARRA DO OURO"              0
2014 1703073 "BARRA DO OURO"              0
2018 1703073 "BARRA DO OURO"              0
2002 1703107 "BARROLANDIA"                0
2006 1703107 "BARROLÂNDIA"                0
2010 1703107 "BARROLÂNDIA"                0
2014 1703107 "BARROLÂNDIA"                0
2018 1703107 "BARROLÂNDIA"                0
2002 1703206 "BERNARDO SAYAO"             0
2006 1703206 "BERNARDO SAYÃO"             0
2010 1703206 "BERNARDO SAYÃO"             0
2014 1703206 "BERNARDO SAYÃO"             0
2018 1703206 "BERNARDO SAYÃO"             0
2002 1703305 "BOM JESUS DO TOCANTINS"     0
2006 1703305 "BOM JESUS DO TOCANTINS"     0
2010 1703305 "BOM JESUS DO TOCANTINS"     0
2014 1703305 "BOM JESUS DO TOCANTINS"     0
2018 1703305 "BOM JESUS DO TOCANTINS"     0
2002 1703602 "BRASILANDIA DO TOCANTINS"   0
2006 1703602 "BRASILÂNDIA DO TOCANTINS"   0
2010 1703602 "BRASILÂNDIA DO TOCANTINS"   0
2014 1703602 "BRASILÂNDIA DO TOCANTINS"   0
2018 1703602 "BRASILÂNDIA DO TOCANTINS"   0
2002 1703701 "BREJINHO DE NAZARE"         0
2006 1703701 "BREJINHO DE NAZARÉ"         0
2010 1703701 "BREJINHO DE NAZARÉ"         0
2014 1703701 "BREJINHO DE NAZARÉ"         0
2018 1703701 "BREJINHO DE NAZARÉ"         0
2002 1703800 "BURITI DO TOCANTINS"        0
2006 1703800 "BURITI DO TOCANTINS"        0
2010 1703800 "BURITI DO TOCANTINS"        0
2014 1703800 "BURITI DO TOCANTINS"        0
2018 1703800 "BURITI DO TOCANTINS"        0
2002 1703826 "CACHOEIRINHA"               0
2006 1703826 "CACHOEIRINHA"               0
2010 1703826 "CACHOEIRINHA"               0
2014 1703826 "CACHOEIRINHA"               0
2018 1703826 "CACHOEIRINHA"               0
2002 1703842 "CAMPOS LINDOS"              0
2006 1703842 "CAMPOS LINDOS"              0
2010 1703842 "CAMPOS LINDOS"              0
2014 1703842 "CAMPOS LINDOS"              0
2018 1703842 "CAMPOS LINDOS"              0
2002 1703867 "CARIRI DO TOCANTINS"        0
2006 1703867 "CARIRI DO TOCANTINS"        0
2010 1703867 "CARIRI DO TOCANTINS"        0
2014 1703867 "CARIRI DO TOCANTINS"        0
2018 1703867 "CARIRI DO TOCANTINS"        0
2002 1703883 "CARMOLANDIA"                0
2006 1703883 "CARMOLÂNDIA"                0
2010 1703883 "CARMOLÂNDIA"                0
2014 1703883 "CARMOLÂNDIA"                0
2018 1703883 "CARMOLÂNDIA"                0
2002 1703891 "CARRASCO BONITO"            0
2006 1703891 "CARRASCO BONITO"            0
2010 1703891 "CARRASCO BONITO"            0
2014 1703891 "CARRASCO BONITO"            0
2018 1703891 "CARRASCO BONITO"            0
2002 1703909 "CASEARA"                    0
2006 1703909 "CASEARA"                    0
2010 1703909 "CASEARA"                    0
2014 1703909 "CASEARA"                    0
2018 1703909 "CASEARA"                    0
2002 1704105 "CENTENARIO"                 0
2006 1704105 "CENTENÁRIO"                 0
2010 1704105 "CENTENÁRIO"                 0
2014 1704105 "CENTENÁRIO"                 0
2018 1704105 "CENTENÁRIO"                 0
2002 1704600 "CHAPADA DE AREIA"           0
2006 1704600 "CHAPADA DE AREIA"           0
2010 1704600 "CHAPADA DE AREIA"           0
2014 1704600 "CHAPADA DE AREIA"           0
2018 1704600 "CHAPADA DE AREIA"           0
2002 1705102 "CHAPADA DA NATIVIDADE"      0
2006 1705102 "CHAPADA DA NATIVIDADE"      0
2010 1705102 "CHAPADA DA NATIVIDADE"      0
2014 1705102 "CHAPADA DA NATIVIDADE"      0
2018 1705102 "CHAPADA DA NATIVIDADE"      0
2002 1705508 "COLINAS DO TOCANTINS"       1
2006 1705508 "COLINAS DO TOCANTINS"       1
2010 1705508 "COLINAS DO TOCANTINS"       1
2014 1705508 "COLINAS DO TOCANTINS"       1
2018 1705508 "COLINAS DO TOCANTINS"       1
2002 1705557 "COMBINADO"                  0
2006 1705557 "COMBINADO"                  0
2010 1705557 "COMBINADO"                  0
2014 1705557 "COMBINADO"                  0
2018 1705557 "COMBINADO"                  0
2002 1705607 "CONCEICAO DO TOCANTINS"     0
2006 1705607 "CONCEIÇÃO DO TOCANTINS"     0
2010 1705607 "CONCEIÇÃO DO TOCANTINS"     0
2014 1705607 "CONCEIÇÃO DO TOCANTINS"     0
2018 1705607 "CONCEIÇÃO DO TOCANTINS"     0
2002 1706001 "COUTO DE MAGALHAES"         0
2006 1706001 "COUTO DE MAGALHÃES"         0
2010 1706001 "COUTO DE MAGALHÃES"         0
2014 1706001 "COUTO MAGALHÃES"            0
2018 1706001 "COUTO MAGALHÃES"            0
2002 1706100 "CRISTALANDIA"               0
2006 1706100 "CRISTALÂNDIA"               0
2010 1706100 "CRISTALÂNDIA"               0
2014 1706100 "CRISTALÂNDIA"               0
2018 1706100 "CRISTALÂNDIA"               0
2002 1706258 "CRIXAS DO TOCANTINS"        0
2006 1706258 "CRIXÁS DO TOCANTINS"        0
2010 1706258 "CRIXÁS DO TOCANTINS"        0
2014 1706258 "CRIXÁS DO TOCANTINS"        0
2018 1706258 "CRIXÁS DO TOCANTINS"        0
2002 1706506 "DARCINOPOLIS"               0
2006 1706506 "DARCINÓPOLIS"               0
2010 1706506 "DARCINÓPOLIS"               0
2014 1706506 "DARCINÓPOLIS"               0
2018 1706506 "DARCINÓPOLIS"               0
2002 1707009 "DIANOPOLIS"                 0
2006 1707009 "DIANÓPOLIS"                 0
2010 1707009 "DIANÓPOLIS"                 1
2014 1707009 "DIANÓPOLIS"                 1
2018 1707009 "DIANÓPOLIS"                 0
2002 1707108 "DIVINOPOLIS DO TOCANTINS"   0
2006 1707108 "DIVINÓPOLIS DO TOCANTINS"   0
2010 1707108 "DIVINÓPOLIS DO TOCANTINS"   0
2014 1707108 "DIVINÓPOLIS DO TOCANTINS"   0
2018 1707108 "DIVINÓPOLIS DO TOCANTINS"   0
2002 1707207 "DOIS IRMAOS DO TOCANTINS"   0
2006 1707207 "DOIS IRMÃOS DO TOCANTINS"   0
2010 1707207 "DOIS IRMÃOS DO TOCANTINS"   0
2014 1707207 "DOIS IRMÃOS DO TOCANTINS"   0
2018 1707207 "DOIS IRMÃOS DO TOCANTINS"   0
2002 1707306 "DUERE"                      0
2006 1707306 "DUERÉ"                      0
2010 1707306 "DUERÉ"                      0
2014 1707306 "DUERÉ"                      0
2018 1707306 "DUERÉ"                      0
2002 1707405 "ESPERANTINA"                0
2006 1707405 "ESPERANTINA"                0
2010 1707405 "ESPERANTINA"                0
2014 1707405 "ESPERANTINA"                0
2018 1707405 "ESPERANTINA"                0
2002 1707553 "FATIMA"                     0
2006 1707553 "FÁTIMA"                     0
2010 1707553 "FÁTIMA"                     0
2014 1707553 "FÁTIMA"                     0
2018 1707553 "FÁTIMA"                     0
2002 1707652 "FIGUEIROPOLIS"              0
2006 1707652 "FIGUEIRÓPOLIS"              0
2010 1707652 "FIGUEIRÓPOLIS"              0
2014 1707652 "FIGUEIRÓPOLIS"              0
2018 1707652 "FIGUEIRÓPOLIS"              0
2002 1707702 "FILADELFIA"                 0
2006 1707702 "FILADÉLFIA"                 0
2010 1707702 "FILADÉLFIA"                 0
2014 1707702 "FILADÉLFIA"                 0
2018 1707702 "FILADÉLFIA"                 0
2002 1708205 "FORMOSO DO ARAGUAIA"        0
2006 1708205 "FORMOSO DO ARAGUAIA"        0
2010 1708205 "FORMOSO DO ARAGUAIA"        0
2014 1708205 "FORMOSO DO ARAGUAIA"        0
2018 1708205 "FORMOSO DO ARAGUAIA"        0
2002 1708254 "FORTALEZA DO TABOCAO"       0
2006 1708254 "FORTALEZA DO TABOCÃO"       0
2010 1708254 "FORTALEZA DO TABOCÃO"       0
2014 1708254 "FORTALEZA DO TABOCÃO"       0
2018 1708254 "FORTALEZA DO TABOCÃO"       0
2002 1708304 "GOIANORTE"                  0
2006 1708304 "GOIANORTE"                  0
2010 1708304 "GOIANORTE"                  0
2014 1708304 "GOIANORTE"                  0
2018 1708304 "GOIANORTE"                  0
2002 1709005 "GOIATINS"                   0
2006 1709005 "GOIATINS"                   0
2010 1709005 "GOIATINS"                   0
2014 1709005 "GOIATINS"                   0
2018 1709005 "GOIATINS"                   0
2002 1709302 "GUARAI"                     1
2006 1709302 "GUARAÍ"                     1
2010 1709302 "GUARAÍ"                     1
2014 1709302 "GUARAÍ"                     1
2018 1709302 "GUARAÍ"                     1
2002 1709500 "GURUPI"                     1
2006 1709500 "GURUPI"                     1
2010 1709500 "GURUPI"                     1
2014 1709500 "GURUPI"                     1
2018 1709500 "GURUPI"                     1
2002 1709807 "IPUEIRAS"                   0
2006 1709807 "IPUEIRAS"                   0
2010 1709807 "IPUEIRAS"                   0
2014 1709807 "IPUEIRAS"                   0
2018 1709807 "IPUEIRAS"                   0
2002 1710508 "ITACAJA"                    0
2006 1710508 "ITACAJÁ"                    0
2010 1710508 "ITACAJÁ"                    0
2014 1710508 "ITACAJÁ"                    0
2018 1710508 "ITACAJÁ"                    0
end
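On question 1, a common starting point (a sketch, not a complete DID setup; the names treated and first_treat are hypothetical, the rest are from the post) is to code treatment as the presence of at least one institution, and to record when each municipality is first treated so you can inspect treatment timing and reversals:

```stata
* 1 if the municipality has at least one higher ed institution that year
gen byte treated = (n_he_inst > 0) if !missing(n_he_inst)

* first year each municipality is ever treated (missing if never treated)
bysort ibge_mun_code: egen first_treat = min(cond(treated == 1, year, .))

* declare the panel structure, which most DID commands expect
xtset ibge_mun_code year
```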

Weights in the rdbwselect command

I am trying to use the rdbwselect command, which has the following syntax (from The Stata Journal (2017) 17, Number 2, pp. 372–404):

rdbwselect depvar runvar [if] [in] [, c(cutoff) p(pvalue) q(qvalue) deriv(dvalue) fuzzy(fuzzyvar [sharpbw] ) covs(covars) kernel(kernelfn) weights(weightsvar) bwselect(bwmethod) scaleregul(scaleregulvalue) vce(vcemethod) all ]

where they define weights(weightsvar) as: "specifies the variable used for optional weighting of the estimation procedure. The unit-specific weights multiply the kernel function."

Can someone guide me towards what these weights are doing exactly? For instance, if I am using survey data with sampling weights, should I be supplying those weights here? My confusion stems from the possibility that these weights instead reflect the fact that values of the running variable closer to the cutoff have a higher probability of receiving the treatment, rather than anything to do with sampling weights in particular. I highly appreciate your time and views!


Three-level ZINB in Stata

Hi,

I use Stata 16.1 and I need to run a 3-level Zero Inflated Negative Binomial regression. I have seen some previous answers suggest using "gllamm".

Based on the "gllamm" manual in Stata I wrote the following code:

meglm y x1 x2 x3 ... xn || lev3: || lev2:, family(nbinomial)

After 11 iterations, the log likelihood stops changing.

Does anyone know whether I am doing something wrong with the code, and how I can get around this?

generating separate variables from one variable's observations: is there a loop?

Dear all,

From the variable v I would like to generate 3 variables, v1, v2, v3, given that these are the 3 observations of v. I need to make separate calculations with them, so I need to split this variable into as many pieces as it has observations. Of course I can write:

gen v1 = v[1]
gen v2 = v[2]
gen v3 = v[3]

although I have more than 3, so I am looking for a proper loop. I have tried the following:

forvalues i = 1/_n {
gen v`i'= v[_n-`i']
}


but it has a syntax error that I cannot fix. Generally, whenever I put `i' inside brackets in such commands I get an error. Could anyone suggest a solution or a different loop that I can use?

Thanks in advance,
Shadi
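For what it's worth, the usual culprit here is that _n and _N cannot appear directly in a forvalues range; storing _N in a local first, and indexing with `i' rather than _n-`i', should give the intended result. A sketch, assuming v already exists in memory:

```stata
* one new variable per observation of v: v1 = v[1], v2 = v[2], ...
local n = _N
forvalues i = 1/`n' {
    gen v`i' = v[`i']
}
```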

Cleaning for Optimal Modeling (ft. Panel Data)

Hello Everyone,

Let's say I would like to run a linear regression on some panel data without knowing whether linear regression is the most appropriate technique. Before doing any analysis, is it better to keep more variables and extrapolate/interpolate their missing observations (say perhaps 25% of the variables need this, out of a total of 30), or is it better to delete the years that contain these missing observations (especially if the raw data has a mismatch of years with and without observations)?

There are also control variables that allow for the testing of models for years and countries excluding the extrapolated observations if it matters. Please see the attached picture for an idea of what I'm talking about.

I'm new to cleaning panel data, so I just want some insight on what to look out for and what to prioritize for significant results that don't compromise the truth. Thank you!

exporting many tables to Excel

Hi, I have a question about how to export many tables in a good format. I am trying to run regressions for 49 industries (later I will need up to 100) and 12 time periods.
I am running my regressions using:

Code:
by industry yeargroup, sort: regress ind monday tuesday wednesday thursday friday, noconstant
This gives me 49x12 tables. I tried using the asdoc package with the wide format, but Stata gives me a syntax error. I would like to know if there is another way to get these regression results into nicely formatted tables, preferably all industries in one table for each time period (thus 12 tables in total, each containing all the industries). Help is very much appreciated.
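One possible route (a sketch, assuming the community-contributed estout package from SSC is acceptable, that industry is a numeric variable, and the variable names from the post): loop over time periods, store one regression per industry with eststo, and write one esttab table per period:

```stata
* ssc install estout   // community package providing eststo/esttab
levelsof yeargroup, local(periods)
foreach p of local periods {
    eststo clear
    levelsof industry if yeargroup == `p', local(inds)
    foreach i of local inds {
        eststo: quietly regress ind monday tuesday wednesday thursday friday ///
            if industry == `i' & yeargroup == `p', noconstant
    }
    * one table per period, industries as columns
    esttab using "results_`p'.rtf", replace
}
```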

Mean of means, with means obtained from by year, sort + weights

Hello,

I have obtained the mean, median, and standard deviation by year (I have 18 years in the sample) for my expenditure variable a_rep_pos through the following command (please note that weights have been applied):

by year, sort: sum a_rep_pos [aweight=weighta],d

I get output for each year as expected. Now I am looking for a command (or commands) that returns 1) the mean of the means, 2) the mean of the medians, and 3) the mean of the standard deviations, where the means, medians, and standard deviations are the (weighted) results obtained from the previous command.

Could anyone help me with this?

Thank you in advance!

Best,
Linda
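One possibility, sketched below: recompute the weighted by-year statistics with collapse (which should reproduce the summarize output, assuming aweights behave the same in both commands) and then average the 18 yearly values with summarize:

```stata
preserve
collapse (mean) mean_rep = a_rep_pos (median) med_rep = a_rep_pos ///
    (sd) sd_rep = a_rep_pos [aweight = weighta], by(year)
* unweighted average of the yearly means, medians, and sds
summarize mean_rep med_rep sd_rep
restore
```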



parsing a variable based on digits

Dear All,

I am trying to create a new variable (let's say caseid_copy) based on the original caseid. However, I have problems with its digits: the new variable should be constructed according to the number of digits in the original caseid. Here is the original data (I also attached what it looks like). Thank you many times!

caseid

101 8 2

1010 7 5

101010 2

101010 9

Here is the wanted data:

caseid_copy

101 8 (if caseid begins with three digits, I want to drop the last digit, which is 2 above)

1010 7 (if caseid begins with four digits, I want to drop the last digit of caseid, which is 5 above)

10101 (if caseid begins with five digits, I want to drop the last digit of caseid, which is 2 above; that is, I want to keep only the first five digits)

101010 (if caseid begins with six digits, I want to drop the last digit of caseid, which is 9 above; that is, I want to keep only the first six digits)
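If the rule is simply "drop the final digit, and any space before it" (which matches the first two examples above, though the exact rule for the longer cases is not fully clear from the post), a string-function sketch might be a starting point; caseid is assumed here to be a string variable:

```stata
* drop the last character of caseid, then trim any trailing space
gen caseid_copy = strtrim(substr(strtrim(caseid), 1, strlen(strtrim(caseid)) - 1))
```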



Replicating Excel Intercept/Slope

I have four variables, y1-y4, that I would like to regress on four fixed x's, x1-x4 (I have taken the logs of both the x's and the y's). For each observation/row, I want to fit a line and capture the intercept and slope. In Excel, this would be INTERCEPT(y1:y4, x1:x4), and the same for SLOPE, with the formula copied down for each row. I cannot figure out how to do this in Stata. Any help is greatly appreciated. Thank you.
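Since each row is its own four-point regression, the Excel SLOPE/INTERCEPT results can be reproduced row-wise with the textbook least-squares formulas; a sketch, assuming the (logged) variables y1-y4 and x1-x4 already exist, with all other names hypothetical:

```stata
* row means of the x's and y's
egen xbar = rowmean(x1 x2 x3 x4)
egen ybar = rowmean(y1 y2 y3 y4)
* per-row sum of cross-products and sum of squared deviations
gen sxy = (x1-xbar)*(y1-ybar) + (x2-xbar)*(y2-ybar) + ///
          (x3-xbar)*(y3-ybar) + (x4-xbar)*(y4-ybar)
gen sxx = (x1-xbar)^2 + (x2-xbar)^2 + (x3-xbar)^2 + (x4-xbar)^2
* OLS slope and intercept, one per row
gen slope     = sxy/sxx
gen intercept = ybar - slope*xbar
```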

how to calculate pseudo-R2 using imputed data in a multinomial logistic regression

Dear all,

I would like to calculate the pseudo-R2 for my multinomial logistic regression in multiple imputed data.
This is my code for imputing data and running the regression(s):
mi set mlong
mi register regular BROWNISH_MULTI GREENISH_MULTI GREEN_MULTI
mi register imputed BERD_AUTO CAR_SALES Log_PATENT_STOCK GREEN_GOV_RD REN_EN_PUB_RD FOSS_FUEL_PUB_RD BEV_SALES EPS GDP_PC
mi impute mvn BERD_AUTO CAR_SALES Log_PATENT_STOCK GREEN_GOV_RD REN_EN_PUB_RD FOSS_FUEL_PUB_RD BEV_SALES EPS GDP_PC = BROWNISH_MULTI, add(20)

mi estimate: mlogit BROWNISH_MULTI l(1).(BERD_AUTO CAR_SALES Log_PATENT_STOCK GREEN_GOV_RD REN_EN_PUB_RD FOSS_FUEL_PUB_RD BEV_SALES EPS GDP_PC), base(0)

I have read in a Statalist post that a possible solution to get the pseudo R2 after mi estimate is the following:
Code:
local rhs "armg2 armg3 tbsaburn20 tbsaburn21"
noi mi estimate, or saving(miest, replace): logistic hodc `rhs', vce(cluster site)
qui mi query
local M = r(M)
scalar r2 = 0
scalar cstat = 0
qui mi xeq 1/`M': logistic hodc `rhs'; scalar r2 = r2 + e(r2_p); lroc, nog; scalar cstat = cstat + r(area)
scalar r2 = r2/`M'
scalar cstat = cstat/`M'
noi di "Pseudo R-squared over imputed data = " r2
noi di "C statistic over imputed data = " cstat
I don't understand exactly how to adapt this code to my needs (e.g. where to plug in my variables).
Could you provide any help, please? Many thanks in advance.
Anna
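To adapt the quoted code, the logistic hodc `rhs' calls would be replaced by the mlogit command from the post, and the lroc/C-statistic part dropped (it applies to binary models only). A sketch under that reading, assuming mi estimate has already been run:

```stata
qui mi query
local M = r(M)
scalar r2 = 0
* fit the model once per imputed dataset and accumulate e(r2_p)
qui mi xeq 1/`M': mlogit BROWNISH_MULTI l(1).(BERD_AUTO CAR_SALES ///
    Log_PATENT_STOCK GREEN_GOV_RD REN_EN_PUB_RD FOSS_FUEL_PUB_RD ///
    BEV_SALES EPS GDP_PC), base(0); scalar r2 = r2 + e(r2_p)
scalar r2 = r2/`M'
di "Average pseudo R-squared over imputed data = " r2
```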

Creating space between bar charts

Hi,

I am using Stata 17 and need help with bar charts. Below is an example of my dataset:

Code:
* Example generated by -dataex-. For more info, type help dataex
clear
input int date float(open high low close) double volume float change
14977 1320.28 1320.28 1276.05 1283.27 11294          .
14978 1283.27 1347.76 1274.62 1347.56 18807   64.29004
14979 1347.56 1350.24 1329.14 1333.34 21310 -14.220093
14980 1333.34 1334.77 1294.95 1298.35 14308  -34.98999
14983 1298.35 1298.35 1276.29 1295.86 11155   -2.48999
14984 1295.86 1311.72 1295.14  1300.8 11913   4.940063
14985  1300.8 1313.76 1287.28 1313.27 12965   12.46997
14986 1313.27 1332.19 1309.72 1326.82 14112  13.549927
14987 1326.82 1333.21 1311.59 1318.55 12760  -8.269897
14991 1318.32 1327.81 1313.33 1326.65 12057   8.099976
14992 1326.65 1346.92 1325.41 1329.47 13491   2.819946
14993 1329.89 1352.71 1327.41 1347.97 14450       18.5
14994 1347.97 1354.55 1336.74 1342.54 14078  -5.429932
14997 1342.54 1353.62 1333.84  1342.9 11640  .35998535
14998  1342.9  1362.9 1339.63  1360.4 12326       17.5
14999  1360.4 1369.75 1357.28  1364.3 13090  3.9000244
15000  1364.3 1367.35 1354.63 1357.51 12580  -6.790039
15001 1357.51 1357.51 1342.75 1354.95 10980 -2.5600586
15004 1354.92 1365.54 1350.36 1364.17 10531   9.220093
15005 1364.17 1375.68  1356.2 1373.73 11498   9.559937
15006 1373.73 1383.37 1364.66 1366.01 12953  -7.719971
15007 1366.01  1373.5 1359.34 1373.47 11188   7.459961
15008 1373.47 1376.38 1348.72 1349.47 10484        -24
15011 1349.47 1354.56 1344.48 1354.31 10130   4.840088
15012 1354.31 1363.55 1350.04 1352.26 10596 -2.0500488
15013 1352.26 1352.26 1334.26 1340.89 11583 -11.369995
15014  1341.1 1350.32 1332.42 1332.53 11072  -8.359985
15015 1332.53 1332.53 1309.98 1314.76 10755  -17.77002
15018 1314.76 1330.96 1313.64 1330.31 10391   15.55005
15019 1330.31 1336.62 1317.51  1318.8 10752  -11.51001
15020  1318.8 1320.73 1304.72 1315.92 11503  -2.880005
15021 1315.92 1331.29 1315.92 1326.61 11537   10.68994
15022 1326.61 1326.61 1293.18 1301.53 12572 -25.079956
15026 1301.53 1307.16 1278.44 1278.94 11122  -22.59009
15027 1278.94 1282.97 1253.16 1255.27 12085  -23.66992
15028 1255.27 1259.94 1228.33 1252.82 13659  -2.450073
15029 1252.82 1252.82 1215.44 1245.86 12313  -6.959961
15032 1245.86 1267.69 1241.71 1267.65 11308   21.79004
15033 1267.65 1272.76 1252.26 1257.94 11141  -9.710083
15034 1257.94 1263.47 1229.65 1239.94 12253        -18
15035 1239.94 1241.36  1214.5 1241.23 12949   1.290039
15036 1241.23 1251.01 1219.74 1234.18 12940  -7.049927
15039 1234.18 1242.55 1234.04 1241.41  9292    7.22998
15040 1241.41 1267.42 1241.41  1253.8 10918  12.390015
15041  1253.8 1263.86  1253.8 1261.89 11322   8.089966
15042 1261.89  1266.5  1257.6 1264.74 11141  2.8499756
15043 1264.74 1264.74 1228.42 1233.42 10859 -31.319946
15046 1233.42 1233.42 1176.78 1180.16 12290  -53.26001
15047 1180.16 1197.83  1171.5 1197.66 13609       17.5
15048 1197.66 1197.66 1155.35 1166.71 13974 -30.950073
end
format %td date
I have run this command to create the bar chart:
Code:
twoway bar change date in 1/20
The bar chart looks like this:

[attachment: Stata bar chart with no spacing between bars]



As you can see, there is no spacing between the bars; I would like them to look like this:

[attachment: Excel bar chart with spacing between bars]

This example was done in Excel and, as you can see, the Excel version has spacing between the bars.

What I also need is to exclude days that have no observations at all. For example, there is no 7th of January in the data, so I don't want a large gap between the bars for the 6th and the 8th of January. In other words, I do not want Stata to insert dates (such as the 7th) that have no data at all.

Is there a way to do such modifications to the bar charts? Thanks!
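One possible approach (a sketch, not the only way): plot against a gap-free observation index so calendar days with no data are simply skipped, and use twoway bar's barwidth() option (a width below 1 on an integer axis) to leave space between bars. The labmask command used below to put the real dates back on the axis is from the labutil package on SSC, so it would need to be installed first.

```stata
* Sketch: an observation index removes calendar gaps; barwidth()
* below 1 leaves visible space between adjacent bars.
sort date
gen seq = _n
twoway bar change seq in 1/20, barwidth(0.6)

* Optionally, label the index positions with the underlying dates:
* ssc install labutil
labmask seq, values(date) lblname(seqdates) format(%td)
twoway bar change seq in 1/20, barwidth(0.6) ///
    xlabel(1(2)20, valuelabel angle(45))
```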






Multilevel model with olog?

I usually use the command:

mixed dep_var || country_var:

The problem is that my dependent variable has only two categories (0/1).
In that case I would normally use the ologit command and interpret the results in terms of probabilities.
Can I do that in a multilevel analysis?

Thanks!
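A sketch of one possibility, using the same variable names as above: for a binary (0/1) outcome the multilevel counterpart of logit is melogit (meologit would be the analogue for an ordered outcome with more than two categories).

```stata
* Sketch: random-intercept logistic model by country,
* reported as odds ratios.
melogit dep_var || country_var: , or

* Predicted probabilities afterwards:
predict phat, mu
```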








Generating new ID variable taking into account duplicates across 2 other variables

First time poster, so I’m sorry for any errors…

I have two ID variables (ID1 and ID2). I want to create a new ID variable taking into account duplicates in both. Where there are duplicates in EITHER ID1 OR ID2, I want to treat this as the same person. Essentially, I want to generate a new variable which looks like NewID below. I have tried using something like:

by ID1 ID2, sort: gen NewID=1 if _n==1
replace NewID = sum(NewID)


but this only takes into account duplicates across BOTH ID1 and ID2. I guess something like the code below would be ideal, but Stata doesn't accept the | symbol in a by varlist:

by ID1 | ID2, sort: gen NewId=1 if _n==1
replace NewID = sum(NewID)


I should also add that ID1 and ID2 are not ordered consistently, so I can’t just use _n-1

ID1  ID2  NewID
1    a    1
1    b    1
2    b    1
3    c    2
1    g    1
4    c    2
5    d    3
5    e    3
6    f    4

Any help would be very much appreciated!! Thank you!
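Not part of the original post, but one way this kind of problem is often framed: linking records through EITHER key is a connected-components problem. The group_id command (available from SSC) is built for exactly this; a plain-Stata sketch propagates the smallest id within each ID1 group, then within each ID2 group, and repeats until nothing changes (assumes no missing values in ID1/ID2).

```stata
* Sketch: fixed-point propagation of a component id across both keys.
gen long comp = _n
local changed 1
while `changed' {
    quietly {
        gen long prev = comp
        bysort ID1 (comp): replace comp = comp[1]   // min id within ID1
        bysort ID2 (comp): replace comp = comp[1]   // min id within ID2
        count if comp != prev
        local changed = r(N)                        // stop when stable
        drop prev
    }
}
egen NewID = group(comp)
```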

Need help with Mean for Grouped Data

Array

Good morning Stata Family,

I want to create a new variable for the average wage earned by each worker over the B and C shifts alone (as seen in the image above).

The code that I know, egen avgwage = mean(wage), by(workerid), gives me the average wage over all the shifts. I just want the average over the B and C shifts only, which I intend to use as a control in my analysis.
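A sketch, assuming a hypothetical string variable shift holding the shift letter: egen's mean() accepts an expression, so cond() can blank out the shifts that should not count, and the missings are then ignored by the mean.

```stata
* Hypothetical variable names: shift ("A","B","C",...), wage, workerid.
* cond() returns wage for B/C shifts and missing otherwise; egen's
* mean() skips missings, giving the B&C-only average per worker.
egen avgwage_bc = mean(cond(inlist(shift, "B", "C"), wage, .)), by(workerid)
```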

Any guidance you can provide will be most appreciated.

Thank you!

Specification of treatment and control group in diff-in-diff

Dear all,
I am doing a difference-in-differences estimation to see the effect of a parental leave reform on mothers' and fathers' wages. However, I am not sure that I am specifying the treatment and control groups, as well as the time dummy, correctly. What I want to estimate in the end is

Code:
xtreg log(wage) i.treatment##i.post, robust
for the simple model. I also want to include fixed effects and control variables

Code:
xtreg log(wage) i.treatment##i.post *controls*, fe robust
But I do not think I specify the treatment/control groups correctly.
The reform was implemented on October 15th, 1997. I have data running from 1994 to 2007. First, I created a time dummy equal to one if the year is greater than or equal to 1997:
Code:
gen post = (year>=1997)
I also created the treatment variable, equal to 1 if the individual had their first child after the reform and 0 otherwise. However, I do not think this is specified correctly, as it is now dependent on time.
I tried to respecify the treatment variable as

Code:
gen treatment = 1 if birthdate>td(15oct1997) & birthdate<=td(31dec1997)
replace treatment = 0 if birthdate>=td(01aug1997) & birthdate<=td(15oct1997)

replace treatment = 1 if birthdate>td(15oct1996) & birthdate<=td(31dec1996)
replace treatment = 0 if birthdate>=td(01aug1996) & birthdate<=td(15oct1996)
as I understand it, I also need a 'treatment' group in 1996 that is never actually treated, since the post dummy only equals 1 for births in or after 1997.

Do you have any ideas on how to specify the control and treatment groups correctly, as I don't think they are correct right now?

Best,
Kamilla

marginal effects

Dear everyone,

I have a regression of GDP on inflation, inflation^2, and one lag of each (plus other control variables). Since I am looking for a convex relationship between inflation and GDP, I want to look at the marginal effect of inflation on GDP, but I am confused about the margins command in Stata even after reading about it. I also have a hard time understanding how to interpret marginal effects when I include both contemporaneous and lagged inflation and inflation^2.

I hope it is ok to ask these questions here, all help is greatly appreciated.
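A sketch under assumed names (gdp, inf, and a yearly time variable are hypothetical; the actual variable names are not given in the post): if the square and the lags are entered with factor-variable notation, margins, dydx() combines the linear and quadratic coefficients into a single marginal effect, evaluated wherever at() specifies.

```stata
* Hypothetical specification: current and one-period-lagged inflation,
* each with its square, entered via factor notation so that margins
* "sees" the quadratic terms. tsset (or xtset) is needed for L.
tsset year
reg gdp c.inf##c.inf cL.inf##cL.inf, robust

* Marginal effect of current inflation (b1 + 2*b2*inf),
* evaluated at several inflation levels:
margins, dydx(inf) at(inf=(0(2)10))
```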

Endogeneity test for two endogenous variables using REIV (xtivreg, re)

Dear all,

I am estimating a random-effects IV (REIV) model with the command "xtivreg, re". I have two endogenous variables, y2 and y3 (y3 is an interaction of y2 with z2, and z2 is assumed to be exogenous), and two instrumental variables.

Jeff Wooldridge explains this in Section 11.2 of his MIT press book (Econometric Analysis of Cross Section and Panel Data) for one endogenous variable. He also gives an example of how to do it in Stata in one of the Statalist posts in 2017.


Krissy: Your case is a bit harder, but doable in Stata. I cover it in Section 11.2 in my MIT Press book, 2010, 2e. I think you have to do it "by hand" as xtreg2 does not support RE and xtregress does not have the endog test available.

Let y1 be the dependent variable, y2 the endogenous explanatory variable, z1, ... zL the exogenous variables, with z1, ..., zM, M < L, included in the model.


Code:
reg y2 z1 z2 ... zL
predict v2hat, resid
xtreg y1 y2 v2hat z1 z2 ... zM, re vce(cluster id)
The t statistic on v2hat is the test of the null that y2 is exogenous. If you reject, you conclude IV is needed. The test is fully robust to serial correlation and heteroskedasticity.

A word of caution: You are requiring pretty strong exogeneity of your instrument. It must be uncorrelated with the heterogeneity in the structural equation, as well as the shocks. If your explanatory variable and instruments change over time, FEIV will be more convincing.

JW

My question is whether I can apply the same procedure in the case of two endogenous explanatory variables (y2 and y3) and two instruments:

Code:
reg y2 z1 z2 ... zL
predict v2hat2, resid
reg y3 z1 z2 ... zL
predict v2hat3, resid
xtreg y1 y2 v2hat2 y3 v2hat3 z1 z2 ... zM, re vce(cluster id)
In this case, are the t-statistics on v2hat2 and v2hat3 the tests of the null that y2 and y3 are exogenous?
Are there any other tests that I might perform in this case?
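Not from the original post, but a natural complement to the two t statistics: a joint Wald test of both control-function residuals tests the exogeneity of y2 and y3 together, and it inherits the robustness of vce(cluster id). (z1 z2 stand in here for the full exogenous list z1 ... zM.)

```stata
* After the control-function RE regression, individual t tests on
* v2hat2 and v2hat3 test each variable's exogeneity separately;
* a joint Wald test covers both at once:
xtreg y1 y2 v2hat2 y3 v2hat3 z1 z2, re vce(cluster id)
test v2hat2 v2hat3
```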

Best regards,
Mehrzad

How to choose part of dataset based on date

Hi

I need some help with Stata syntax. I have a dataset of persons who participated in a health survey and later had a specific surgery: approx. 1,000 persons and 650 variables. The health surveys were performed every 5-10 years from 1970 to 2018, a total of 7 times. Each survey has about 100 variables. Each person participated in between 1 and 7 surveys. The dataset is organised like this:

ID date1 height1 weight1 bloodpressure1 ....... date2 height2 weight2 bloodpressure2 ........ date3 height3 weight3 bloodpressure3 ...[....] date7 height7 weight7 ...... date_of_surgery

For each person, I want to perform analyses on the variables from the survey closest to the date of surgery (before surgery, not after). How do I transform my dataset to select only those variables? The date of surgery is sometimes between surveys, sometimes after the last survey.

I am grateful for any tips.
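A sketch of one approach, assuming the wide layout described (suffixes 1-7) and showing only a few of the roughly 100 repeated stubs for illustration: reshape to long, drop surveys after surgery, and keep the latest remaining survey per person.

```stata
* Sketch: list every survey-specific stub in the reshape; only a few
* are shown here. Names follow the layout described in the post.
reshape long date height weight bloodpressure, i(ID) j(survey)
drop if missing(date)                    // surveys the person skipped
keep if date <= date_of_surgery          // only pre-surgery surveys
bysort ID (date): keep if _n == _N       // the survey closest to surgery
```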

Multiple observations per year in the dependent variable and annual observations for the explanatory variables

Hello Statalisters,

I am trying to explore the impact of CSR expenses on the credit ratings of firms. Please find below a snapshot of the variables that I am working with. The dependent variable is the firm's credit rating. Since many companies borrow more than once in a year, they receive multiple, and often different, credit ratings each year. However, the explanatory variables, like profits, assets, etc., are reported annually. I would like to incorporate a one-year lag in the model to allow for the time credit rating agencies take to award ratings, and I am using ordered logit regression, since credit ratings follow an ordered pattern.

Even though the xtologit command is working, I am unable to incorporate the 1-year lag.
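Not from the original post, but a sketch of one workaround: with multiple ratings per firm-year, the time-series lag operator L. cannot be applied directly, so lagged annual covariates can be built by hand with a self-merge on firm and year+1 (variable names taken from the dataex extract below; assumes the annual covariates are constant within firm-year, as they appear to be).

```stata
* Sketch: save one record per firm-year of the annual covariates,
* shift the year forward by one, and merge back as lagged values.
preserve
keep cocode year csr_exp lnsales assets
duplicates drop                          // one row per firm-year
replace year = year + 1                  // value of year t feeds year t+1
rename (csr_exp lnsales assets) (L1_csr_exp L1_lnsales L1_assets)
tempfile lagged
save `lagged'
restore
merge m:1 cocode year using `lagged', keep(master match) nogenerate

xtset cocode
xtologit rating_score L1_csr_exp L1_lnsales L1_assets
```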

Kindly advise.

Thank you in advance.

Best regards,

Avik



Code:
* Example generated by -dataex-. For more info, type help dataex
clear
input long cocode str4 rating float(rating_score year) double csr_exp float lnsales byte pat_d float assets double(cap_int margin PSII) float aud_d byte busgrp_score
  11 "BBB"  13 2009                 .1   7.28283 1  7.077667 .9962875464056699    .06927358944402447               .681 0 0
  11 "BBB"  13 2011                2.4  7.867412 1  7.692159 .9884989274793483    .08468668607323425              .6154 0 0
  11 "BBB-" 12 2012                 .7  8.030475 1  7.931967 .9909152931882652    .10689183912534167              .5965 0 0
  11 "BBB-" 12 2013                 .3   8.08336 1  8.025386 .9931972789115646    .10700287028178143              .6734 0 0
  11 "D"     2 2015                 .2 8.2212105 0  8.160204 .9959701620509304    .06429147620328045              .5405 0 0
  11 "D"     2 2015                 .2 8.2212105 0  8.160204 .9959701620509304    .06429147620328045              .5405 0 0
  11 "BB-"   9 2015                 .2 8.2212105 0  8.160204 .9959701620509304    .06429147620328045              .5405 0 0
  11 "BB-"   9 2015                 .2 8.2212105 0  8.160204 .9959701620509304    .06429147620328045              .5405 0 0
  11 "BB-"   9 2015                 .2 8.2212105 0  8.160204 .9959701620509304    .06429147620328045              .5405 0 0
  11 "BB-"   9 2015                 .2 8.2212105 0  8.160204 .9959701620509304    .06429147620328045              .5405 0 0
  11 "D"     2 2016  .6000000000000001  8.229618 1  8.169874 .9939992640606867    .10089590443686006              .5101 0 0
  11 "D"     2 2016  .6000000000000001  8.229618 1  8.169874 .9939992640606867    .10089590443686006              .5101 0 0
  11 "BB"   10 2017                 .6   8.31801 1  8.268808 .9965387277901699    .11535269709543569              .4433 0 0
  11 "BB"   10 2017                 .6   8.31801 1  8.268808 .9965387277901699    .11535269709543569              .4433 0 0
  11 "BB"   10 2017                 .6   8.31801 1  8.268808 .9965387277901699    .11535269709543569              .4433 0 0
  11 "BB"   10 2017                 .6   8.31801 1  8.268808 .9965387277901699    .11535269709543569              .4433 0 0
  11 "BBB-" 12 2018                2.1   8.36969 1  8.310906 .9980580137659784    .11835060148807454              .4373 0 0
  11 "BBB-" 12 2019                4.8  8.475829 1  8.385649 .9986769167598147    .12579730687455706              .4414 0 0
 289 "BBB"  13 2012                1.3  7.856359 1  7.465369 .9988548525622674    .07749506215870802  .5854999999999999 0 1
 289 "BBB"  13 2014                1.4  7.402513 1  7.207415 .9986660738105825     .0564599719529297              .6381 0 1
 365 "A"    16 2009               22.1  9.556055 1 10.649092 .9983963676303486     .2802193913658882              .8336 0 1
 365 "A"    16 2010               30.2  9.802379 1  10.89443 .9995600960735327     .3171541265828387               .872 0 1
 365 "A"    16 2011               12.5  9.968264 1  11.26353 .9953071584761162     .2511013009410265              .8612 0 1
 365 "A"    16 2011               12.5  9.968264 1  11.26353 .9953071584761162     .2511013009410265              .8612 0 1
 365 "A"    16 2011               12.5  9.968264 1  11.26353 .9953071584761162     .2511013009410265              .8612 0 1
 365 "A"    16 2011               12.5  9.968264 1  11.26353 .9953071584761162     .2511013009410265              .8612 0 1
 365 "BBB-" 12 2013               44.4  9.945358 1 11.621166  .996872520900714     .2984915033181173              .7701 0 1
 365 "BB"   10 2013               44.4  9.945358 1 11.621166  .996872520900714     .2984915033181173              .7701 0 1
 365 "BB"   10 2013               44.4  9.945358 1 11.621166  .996872520900714     .2984915033181173              .7701 0 1
 771 "BBB"  13 2015                1.1  6.983882 1  7.113386 .9998371733289912    .16541562413122046              .4468 0 0
 771 "BBB"  13 2016                1.9  7.192107 1  7.495042 .9994997498749374     .1593799382948303              .4538 0 0
 771 "BBB-" 12 2017                2.9  7.272051 1  7.675732 .9900714484550431    .15109412990621743               .569 0 0
 771 "BBB-" 12 2017                2.9  7.272051 1  7.675732 .9900714484550431    .15109412990621743               .569 0 0
 771 "BBB-" 12 2018                2.6  7.249215 1  7.772121 .9923741310301243    .15415778251599147  .5599000000000001 1 0
 771 "BBB-" 12 2018                2.6  7.249215 1  7.772121 .9923741310301243    .15415778251599147  .5599000000000001 1 0
 771 "BBB-" 12 2019                2.4  7.475283 1  7.871731 .9941649822661225    .12727478882022789              .5764 0 0
1317 "A-"   15 2009                 .2  2.424803 1  7.701562 .9976489736865902     85.08849557522123              .6126 0 0
1317 "A"    16 2010                  0 2.1860514 1  8.003229 .9990302949240955     132.9887640449438              .5868 0 0
2015 "B"     7 2009                1.1  7.051336 1  7.892601  .999887950997236     .1463097713097713               .575 0 0
2015 "D"     2 2011                 .5  7.354938 1  7.903892 .9999261393012778    .11343436281092141               .501 0 0
2015 "B"     7 2011                 .5  7.354938 1  7.903892 .9999261393012778    .11343436281092141               .501 0 0
2015 "D"     2 2012                  0  7.463019 1  8.012582 .9999668731573194    .14095500459136823  .5395000000000001 0 0
2015 "D"     2 2014                  0  7.413187 0  8.041252                 1    .09464921276467395               .512 0 0
2384 "A"    16 2010                  0  7.374441 1  7.040361 .9945699772289367    .06421270458393429              .7129 1 0
2717 "AA"   19 2009                  0  8.759967 0  9.979332 .9817157105844947    -.5392146095796923              .8915 0 1
2717 "AA"   19 2010                  0  9.143057 0 10.278332 .9895962578148361   -.17724321606965227              .8725 0 1
2717 "AA"   19 2011                  0  9.572661 0 10.212482 .9858926401650447   -.02686787409686356              .9122 1 1
2717 "BBB-" 12 2011                  0  9.572661 0 10.212482 .9858926401650447   -.02686787409686356              .9122 1 1
2717 "AA"   19 2011                  0  9.572661 0 10.212482 .9858926401650447   -.02686787409686356              .9122 1 1
2717 "BBB"  13 2011                  0  9.572661 0 10.212482 .9858926401650447   -.02686787409686356              .9122 1 1
2717 "AA"   19 2011                  0  9.572661 0 10.212482 .9858926401650447   -.02686787409686356              .9122 1 1
2717 "AA"   19 2012                  0  9.882228 0 10.179546 .9983572661570271    .03275397996864035              .9237 1 1
2717 "BBB"  13 2012                  0  9.882228 0 10.179546 .9983572661570271    .03275397996864035              .9237 1 1
2717 "AA"   19 2012                  0  9.882228 0 10.179546 .9983572661570271    .03275397996864035              .9237 1 1
2717 "AA"   19 2012                  0  9.882228 0 10.179546 .9983572661570271    .03275397996864035              .9237 1 1
2717 "BBB"  13 2012                  0  9.882228 0 10.179546 .9983572661570271    .03275397996864035              .9237 1 1
2717 "AA"   19 2012                  0  9.882228 0 10.179546 .9983572661570271    .03275397996864035              .9237 1 1
2717 "AA"   19 2012                  0  9.882228 0 10.179546 .9983572661570271    .03275397996864035              .9237 1 1
2717 "BBB"  13 2013                  0  9.907056 0 10.359493 .9978862437886624    .03109882171237825              .8924 1 1
2717 "BBB"  13 2013                  0  9.907056 0 10.359493 .9978862437886624    .03109882171237825              .8924 1 1
2717 "AA"   19 2013                  0  9.907056 0 10.359493 .9978862437886624    .03109882171237825              .8924 1 1
2717 "AA"   19 2013                  0  9.907056 0 10.359493 .9978862437886624    .03109882171237825              .8924 1 1
2717 "BBB"  13 2013                  0  9.907056 0 10.359493 .9978862437886624    .03109882171237825              .8924 1 1
2717 "AA"   19 2013                  0  9.907056 0 10.359493 .9978862437886624    .03109882171237825              .8924 1 1
2717 "BBB"  13 2013                  0  9.907056 0 10.359493 .9978862437886624    .03109882171237825              .8924 1 1
2717 "AA"   19 2013                  0  9.907056 0 10.359493 .9978862437886624    .03109882171237825              .8924 1 1
2717 "A-"   15 2014                  0 10.082482 0 10.229534 .9972644859543264  -.010371943495691944              .8896 1 1
2717 "A-"   15 2014                  0 10.082482 0 10.229534 .9972644859543264  -.010371943495691944              .8896 1 1
2717 "A-"   15 2014                  0 10.082482 0 10.229534 .9972644859543264  -.010371943495691944              .8896 1 1
2717 "A-"   15 2014                  0 10.082482 0 10.229534 .9972644859543264  -.010371943495691944              .8896 1 1
2717 "A-"   15 2014                  0 10.082482 0 10.229534 .9972644859543264  -.010371943495691944              .8896 1 1
2717 "A-"   15 2014                  0 10.082482 0 10.229534 .9972644859543264  -.010371943495691944              .8896 1 1
2717 "AA"   19 2014                  0 10.082482 0 10.229534 .9972644859543264  -.010371943495691944              .8896 1 1
2717 "A-"   15 2015                  0 10.168529 1  10.36415 .9965113161422601    .07010820991419157              .8899 0 1
2717 "A-"   15 2015                  0 10.168529 1  10.36415 .9965113161422601    .07010820991419157              .8899 0 1
2717 "A"    16 2015                  0 10.168529 1  10.36415 .9965113161422601    .07010820991419157              .8899 0 1
2717 "A"    16 2015                  0 10.168529 1  10.36415 .9965113161422601    .07010820991419157              .8899 0 1
2717 "A"    16 2015                  0 10.168529 1  10.36415 .9965113161422601    .07010820991419157              .8899 0 1
2717 "A"    16 2016                1.1 10.328735 1 10.589537 .9979731652109545    .16282999013052543               .903 0 1
2717 "AAA"  22 2019               44.7  11.02941 0 11.893705 .5294381331465569    -.1485064375872711              .7384 0 1
2717 "BB"   10 2019               44.7  11.02941 0 11.893705 .5294381331465569    -.1485064375872711              .7384 0 1
2717 "D"     2 2019               44.7  11.02941 0 11.893705 .5294381331465569    -.1485064375872711              .7384 0 1
2842 "BBB"  13 2010                  0  6.535676 1  5.859076 .9563356164383562    .07819527056434064  .5982000000000001 0 0
2842 "BBB"  13 2011                  0  7.017954 1  6.491785 .6731847809610428    .09565606806986117              .6001 0 0
2842 "BBB"  13 2011                  0  7.017954 1  6.491785 .6731847809610428    .09565606806986117              .6001 0 0
2842 "BBB"  13 2012                  0  7.283242 1  6.770675 .6121114806743894    .12282750566737652              .6067 0 0
2842 "BBB"  13 2012                  0  7.283242 1  6.770675 .6121114806743894    .12282750566737652              .6067 0 0
2842 "BBB"  13 2014                  0   7.42028 1  7.066552 .5652248101697807    .09625636418089248              .6171 0 0
2842 "BBB"  13 2014                  0   7.42028 1  7.066552 .5652248101697807    .09625636418089248              .6171 0 0
2842 "BBB"  13 2014                  0   7.42028 1  7.066552 .5652248101697807    .09625636418089248              .6171 0 0
2842 "BBB"  13 2015                  0  7.401536 1  7.071573 .5880305602716469    .08751907232224596              .6188 0 0
2842 "BBB"  13 2016                  0  6.446513 1  6.804171  .991681455190772    .26217287866772404              .6188 0 0
3335 "A-"   15 2013                1.3  7.897668 1  7.733684                 1     .3018208844295801              .7437 0 1
3335 "A-"   15 2014                2.1  7.982143 1  7.778254                 1     .2485827470801175              .7436 0 1
3335 "A-"   15 2014                2.1  7.982143 1  7.778254                 1     .2485827470801175              .7436 0 1
3990 "AA"   19 2008                  0 8.1215105 1  8.336511 .9543069366239368    .10273016250259945              .5353 0 1
3990 "AA"   19 2010                3.9  8.525498 1  8.340814 .9718955028032923    .13088839082055656              .5003 0 1
3990 "A-"   15 2015               11.3  9.375516 1  9.215437 .9986966600670573    .12166836215666328  .6224000000000001 0 1
3990 "A-"   15 2015               11.3  9.375516 1  9.215437 .9986966600670573    .12166836215666328  .6224000000000001 0 1
3990 "A"    16 2015               11.3  9.375516 1  9.215437 .9986966600670573    .12166836215666328  .6224000000000001 0 1
3990 "A"    16 2015               11.3  9.375516 1  9.215437 .9986966600670573    .12166836215666328  .6224000000000001 0 1
3990 "A"    16 2015               11.3  9.375516 1  9.215437 .9986966600670573    .12166836215666328  .6224000000000001 0 1
3990 "A"    16 2015               11.3  9.375516 1  9.215437 .9986966600670573    .12166836215666328  .6224000000000001 0 1
3990 "A"    16 2015               11.3  9.375516 1  9.215437 .9986966600670573    .12166836215666328  .6224000000000001 0 1
3990 "A"    16 2015               11.3  9.375516 1  9.215437 .9986966600670573    .12166836215666328  .6224000000000001 0 1
3990 "A"    16 2015               11.3  9.375516 1  9.215437 .9986966600670573    .12166836215666328  .6224000000000001 0 1
3990 "A"    16 2015               11.3  9.375516 1  9.215437 .9986966600670573    .12166836215666328  .6224000000000001 0 1
3990 "A"    16 2016               17.6  9.415979 1  9.308274 .9936167704848172    .11664807743973427              .6575 0 1
3990 "A"    16 2016               17.6  9.415979 1  9.308274 .9936167704848172    .11664807743973427              .6575 0 1
3990 "A"    16 2016               17.6  9.415979 1  9.308274 .9936167704848172    .11664807743973427              .6575 0 1
3990 "A"    16 2016               17.6  9.415979 1  9.308274 .9936167704848172    .11664807743973427              .6575 0 1
3990 "A"    16 2016               17.6  9.415979 1  9.308274 .9936167704848172    .11664807743973427              .6575 0 1
3990 "A"    16 2016               17.6  9.415979 1  9.308274 .9936167704848172    .11664807743973427              .6575 0 1
3990 "A-"   15 2017               19.2  9.463493 1  9.385167 .9949036136485759    .12271753307247998  .6759999999999999 0 1
3990 "A"    16 2017               19.2  9.463493 1  9.385167 .9949036136485759    .12271753307247998  .6759999999999999 0 1
3990 "A-"   15 2017               19.2  9.463493 1  9.385167 .9949036136485759    .12271753307247998  .6759999999999999 0 1
3990 "A"    16 2018               20.7   9.44349 1   9.53976 .9967269954537608    .12900670853893248              .6782 0 1
3990 "A"    16 2018               20.7   9.44349 1   9.53976 .9967269954537608    .12900670853893248              .6782 0 1
3998 "A"    16 2008                2.8  9.171226 1  9.152414  .991395752977578    .10095253941183811              .5345 0 1
3998 "A"    16 2008                2.8  9.171226 1  9.152414  .991395752977578    .10095253941183811              .5345 0 1
3998 "A"    16 2009                7.6  9.648615 1  9.295572 .9836726447927859    .13558447255859565              .5642 0 1
3998 "AA-"  18 2015               36.9 10.348997 1  10.28826 .9999863878906675     .1262966082425694  .7243999999999999 0 1
3998 "AA-"  18 2016               53.2 10.311227 1 10.297828 .9998584338060071    .15246289281402675  .7050000000000001 0 1
3998 "AA-"  18 2016               53.2 10.311227 1 10.297828 .9998584338060071    .15246289281402675  .7050000000000001 0 1
3998 "AA-"  18 2016               53.2 10.311227 1 10.297828 .9998584338060071    .15246289281402675  .7050000000000001 0 1
3998 "AA-"  18 2016               53.2 10.311227 1 10.297828 .9998584338060071    .15246289281402675  .7050000000000001 0 1
3998 "AA-"  18 2016               53.2 10.311227 1 10.297828 .9998584338060071    .15246289281402675  .7050000000000001 0 1
3998 "AA-"  18 2017               89.2  10.36154 1 10.463384 .9993944552668111     .1684798765330584              .6993 0 1
3998 "AA-"  18 2017               89.2  10.36154 1 10.463384 .9993944552668111     .1684798765330584              .6993 0 1
3998 "AA-"  18 2017               89.2  10.36154 1 10.463384 .9993944552668111     .1684798765330584              .6993 0 1
3998 "AA-"  18 2017               89.2  10.36154 1 10.463384 .9993944552668111     .1684798765330584              .6993 0 1
3998 "AA-"  18 2017               89.2  10.36154 1 10.463384 .9993944552668111     .1684798765330584              .6993 0 1
3998 "AA-"  18 2018               80.2 10.546862 1     10.69 .9996083298227237     .1473153992085722  .6889000000000001 0 1
3998 "AA-"  18 2018               80.2 10.546862 1     10.69 .9996083298227237     .1473153992085722  .6889000000000001 0 1
3998 "AA-"  18 2018               80.2 10.546862 1     10.69 .9996083298227237     .1473153992085722  .6889000000000001 0 1
3998 "AA-"  18 2018               80.2 10.546862 1     10.69 .9996083298227237     .1473153992085722  .6889000000000001 0 1
3998 "AA-"  18 2019              106.2  10.75909 1 10.978138 .9997746648139108     .1714050127720706  .7346999999999999 0 1
3998 "AA-"  18 2019              106.2  10.75909 1 10.978138 .9997746648139108     .1714050127720706  .7346999999999999 0 1
3998 "AA-"  18 2019              106.2  10.75909 1 10.978138 .9997746648139108     .1714050127720706  .7346999999999999 0 1
3998 "AA-"  18 2019              106.2  10.75909 1 10.978138 .9997746648139108     .1714050127720706  .7346999999999999 0 1
3998 "AA-"  18 2019              106.2  10.75909 1 10.978138 .9997746648139108     .1714050127720706  .7346999999999999 0 1
3998 "AA-"  18 2019              106.2  10.75909 1 10.978138 .9997746648139108     .1714050127720706  .7346999999999999 0 1
3998 "AA-"  18 2019              106.2  10.75909 1 10.978138 .9997746648139108     .1714050127720706  .7346999999999999 0 1
4024 "A-"   15 2018                  0  9.020462 1  9.126492 .9992170253488043   .060261649698933564  .6882999999999999 0 0
4024 "A-"   15 2018                  0  9.020462 1  9.126492 .9992170253488043   .060261649698933564  .6882999999999999 0 0
4024 "A-"   15 2018                  0  9.020462 1  9.126492 .9992170253488043   .060261649698933564  .6882999999999999 0 0
4024 "A-"   15 2019                  0  8.924723 1  9.173573 .9994294013901857    .06875124742199455               .679 0 0
4024 "BBB"  13 2019                  0  8.924723 1  9.173573 .9994294013901857    .06875124742199455               .679 0 0
4024 "D"     2 2019                  0  8.924723 1  9.173573 .9994294013901857    .06875124742199455               .679 0 0
4030 "BBB"  13 2019                1.2  7.630655 1  7.139026 .9980161879066815   .057852844107940206              .7338 0 0
4253 "A"    16 2008                  0  9.927677 1 11.896345 .6981559251247534     .5212960658281967              .8226 0 1
4253 "BBB"  13 2009                  0 10.326138 1  12.19808 .7177981734607475     .5277621231979029              .7136 0 1
4253 "BB"   10 2010                  0  10.42214 1 12.062345  .713925900867486     .5239572202063979  .7696000000000001 0 1
4253 "BB"   10 2010                  0  10.42214 1 12.062345  .713925900867486     .5239572202063979  .7696000000000001 0 1
4253 "BB"   10 2010                  0  10.42214 1 12.062345  .713925900867486     .5239572202063979  .7696000000000001 0 1
4253 "C"     4 2012                  0  10.36196 1 12.068918 .6779403912779417     .4550802798334656              .6554 0 1
4253 "D"     2 2012                  0  10.36196 1 12.068918 .6779403912779417     .4550802798334656              .6554 0 1
4253 "D"     2 2013                  0 10.511404 1 12.117255 .6725680400169178     .4425568986170097              .6551 0 1
4253 "D"     2 2013                  0 10.511404 1 12.117255 .6725680400169178     .4425568986170097              .6551 0 1
4253 "D"     2 2014                  0 10.580647 1 12.218476 .6735596389263794     .4540436320035768              .6613 0 1
4253 "BB-"   9 2014                  0 10.580647 1 12.218476 .6735596389263794     .4540436320035768              .6613 0 1
4253 "D"     2 2016                 10 10.414786 0 12.237112 .9500037080966897     .3067369134399429              .5339 0 1
4253 "D"     2 2017                  0  9.774756 0 12.140345 .9460959188283031    .13693543067071576  .5307999999999999 0 1
4253 "D"     2 2018                1.7    9.5942 0 12.039883                 1    -.8421597612981457              .5104 0 1
4253 "D"     2 2019                2.9 9.0470915 0 11.826667 .9809000274746152    -4.803374225905955              .4869 0 1
4671 "A"    16 2017               15.2 10.792982 1 10.113408 .9915084550657436    .03893545338756728              .7115 0 1
4671 "A"    16 2018 23.099999999999998  11.04617 1  10.07854  .992533429585918     .0408079944628202  .6849000000000001 0 1
4671 "AA-"  18 2018 23.099999999999998  11.04617 1  10.07854  .992533429585918     .0408079944628202  .6849000000000001 0 1
4671 "AA-"  18 2019 26.900000000000002 11.297006 1 10.048293 .9930741511401047    .04011789525936957              .7029 0 1
4671 "AA-"  18 2019 26.900000000000002 11.297006 1 10.048293 .9930741511401047    .04011789525936957              .7029 0 1
4671 "AA-"  18 2019 26.900000000000002 11.297006 1 10.048293 .9930741511401047    .04011789525936957              .7029 0 1
4671 "AA-"  18 2019 26.900000000000002 11.297006 1 10.048293 .9930741511401047    .04011789525936957              .7029 0 1
4671 "AA-"  18 2019 26.900000000000002 11.297006 1 10.048293 .9930741511401047    .04011789525936957              .7029 0 1
4671 "AA-"  18 2019 26.900000000000002 11.297006 1 10.048293 .9930741511401047    .04011789525936957              .7029 0 1
4671 "AA-"  18 2019 26.900000000000002 11.297006 1 10.048293 .9930741511401047    .04011789525936957              .7029 0 1
4671 "AA-"  18 2019 26.900000000000002 11.297006 1 10.048293 .9930741511401047    .04011789525936957              .7029 0 1
4709 "BBB"  13 2009                  0  9.563101 0  10.09382 .9989542429628405    .10923478028657968              .7048 1 1
4709 "BBB"  13 2012               11.7 10.229625 0  10.44225 .9976894599112544    .06374110947925275  .5672999999999999 1 1
4709 "BBB"  13 2013                5.2  10.42755 1 10.439264 .9982443563371421    .11114203241802975  .5740999999999999 1 1
4709 "A-"   15 2014                6.1  10.57378 1 10.418832 .9986978972234749    .12410924846916069              .6233 1 1
4709 "A-"   15 2015               24.2 10.554912 1  10.76657 .9993608821935028    .10584015805908448              .6893 1 1
4709 "A-"   15 2015               24.2 10.554912 1  10.76657 .9993608821935028    .10584015805908448              .6893 1 1
4709 "A"    16 2016               34.3 10.524247 1  11.14619 .9943029645363871    .11581753872947954              .6906 1 1
4709 "A"    16 2016               34.3 10.524247 1  11.14619 .9943029645363871    .11581753872947954              .6906 1 1
4709 "A"    16 2016               34.3 10.524247 1  11.14619 .9943029645363871    .11581753872947954              .6906 1 1
4709 "A"    16 2016               34.3 10.524247 1  11.14619 .9943029645363871    .11581753872947954              .6906 1 1
4709 "AA-"  18 2017               54.7 10.753013 1 11.071162 .9942852876488174     .1238061435898971              .7147 1 1
4709 "AA-"  18 2018               62.2 10.732495 1  11.04983 .9920693687910356     .1141378715826184              .7138 1 1
4709 "AA-"  18 2018               62.2 10.732495 1  11.04983 .9920693687910356     .1141378715826184              .7138 1 1
4709 "AA-"  18 2019               93.4   10.8683 1 11.034452 .9929662804225717     .1426151735700949              .7374 1 1
4709 "AA-"  18 2019               93.4   10.8683 1 11.034452 .9929662804225717     .1426151735700949              .7374 1 1
5003 "BBB"  13 2010                  0  7.935301 1  7.722058 .9126063075832742   .042087180588361604  .7273000000000001 0 0
5003 "BBB"  13 2011                  0  8.331779 1  7.997966 .9667551850482369    .04810053445038278              .7262 0 0
5003 "BBB"  13 2012                  0  8.545334 1  8.149081 .9322871510317322   .050245979738269775              .6818 0 0
5003 "BB"   10 2012                  0  8.545334 1  8.149081 .9322871510317322   .050245979738269775              .6818 0 0
5003 "BB"   10 2013                  0  8.289639 1  8.190493 .9021766255372244    .06024005624748895              .6733 0 0
5003 "BB"   10 2014                  0  8.346689 1   8.24273 .9010763441143188    .07435605521559699  .7252000000000001 0 0
5417 "A-"   15 2010                  0  8.503682 1  9.638493 .9443633425007494     .1096717954955503              .5736 0 1
5417 "A-"   15 2011                  0  8.834992 1  9.799109 .9335098264484368    .08207554035368604              .6323 0 1
5417 "A-"   15 2011                  0  8.834992 1  9.799109 .9335098264484368    .08207554035368604              .6323 0 1
5417 "BBB"  13 2015                  0  8.898133 0 9.2553425 .9678881102836439   -.02434991733052758              .4709 0 1
5417 "D"     2 2015                  0  8.898133 0 9.2553425 .9678881102836439   -.02434991733052758              .4709 0 1
5417 "D"     2 2017                  0  7.892826 0  8.540441 .9455017977176803    -.6054144884241971  .7405999999999999 0 1
5417 "D"     2 2017                  0  7.892826 0  8.540441 .9455017977176803    -.6054144884241971  .7405999999999999 0 1
5417 "D"     2 2019                  0  6.781738 0  7.507141 .8475562877539814    -.4160617059891107                .74 0 1
5574 "BBB-" 12 2011                 .2  6.325792 1   6.41001 .9983549925974667    .15139584824624194              .4655 0 0
5574 "BBB-" 12 2011                 .2  6.325792 1   6.41001 .9983549925974667    .15139584824624194              .4655 0 0
5574 "BBB-" 12 2012                 .1  6.435509 1  6.429235 .9980635791512021    .07665169980756895              .4655 0 0
5574 "BBB-" 12 2012                 .1  6.435509 1  6.429235 .9980635791512021    .07665169980756895              .4655 0 0
5574 "BBB-" 12 2014                 .4  6.969509 1  6.888878 .9966371140324061     .1426153990786876              .4717 0 0
5574 "BBB"  13 2014                 .4  6.969509 1  6.888878 .9966371140324061     .1426153990786876              .4717 0 0
5574 "BBB"  13 2015                1.9  7.183795 1  7.300406 .8771692889459112     .1587373852340845              .4726 0 0
5574 "BBB"  13 2015                1.9  7.183795 1  7.300406 .8771692889459112     .1587373852340845              .4726 0 0
5574 "BBB"  13 2016                2.2  7.488237 1  7.602701  .892343781193851    .15014830152778555 .42469999999999997 0 0
5747 "AA-"  18 2010                  0 12.464047 1  12.62878 .9984761711301431     .0645515571466264              .8784 0 1
5747 "AA-"  18 2012              376.4 12.883516 1  13.88473 .9865011680362573    .11586507266343718              .9462 0 1
5747 "AA-"  18 2012              376.4 12.883516 1  13.88473 .9865011680362573    .11586507266343718              .9462 0 1
5747 "A"    16 2014              420.8 13.223075 1  13.98709 .9722133777276422    .14495812680047526              .9659 0 1
5747 "A"    16 2015              485.9 13.380417 1 14.087435 .9589225071535914    .15970901905875412              .9558 0 1
5747 "A"    16 2015              485.9 13.380417 1 14.087435 .9589225071535914    .15970901905875412              .9558 0 1
5747 "A"    16 2016               38.4  12.73705 1 12.942528 .9174407995711795    .07666255732183228               .888 0 1
5747 "A"    16 2017               52.9  12.81069 1 13.076218 .9294690812639439    .06610437725888146              .9588 0 1
5747 "A"    16 2018               72.2  12.79183 1 13.246348 .9404572017365385   .051178265241961673              .9671 0 1
5747 "AAA"  22 2019               72.5 12.908742 1 12.959246 .9234247331637416    .06308344910072877              .9676 0 1
5747 "AAA"  22 2019               72.5 12.908742 1 12.959246 .9234247331637416    .06308344910072877              .9676 0 1
5757 "BBB"  13 2010                1.5  8.377609 1 12.096706 .9999263723195619     .5527986018488709  .8422000000000001 1 1
5757 "BBB"  13 2010                1.5  8.377609 1 12.096706 .9999263723195619     .5527986018488709  .8422000000000001 1 1
5757 "BBB"  13 2010                1.5  8.377609 1 12.096706 .9999263723195619     .5527986018488709  .8422000000000001 1 1
5757 "BBB"  13 2010                1.5  8.377609 1 12.096706 .9999263723195619     .5527986018488709  .8422000000000001 1 1
5757 "BBB"  13 2010                1.5  8.377609 1 12.096706 .9999263723195619     .5527986018488709  .8422000000000001 1 1
5757 "A-"   15 2011               93.2  9.968896 1 12.773134 .9999741928823971     .5039832520759276              .8463 1 1
5757 "BBB"  13 2012               92.8 10.619413 0 13.149864 .9999692571715431    .24983871598373475  .8392999999999999 1 1

end



Thursday, April 29, 2021

Help needed: Export F-stats and P-values to an excel file

Hi,

I am using Stata 17 and need help exporting results from Stata to Excel. Here is an example of my data:

Code:
* Example generated by -dataex-. For more info, type help dataex
clear
input byte(korea uk germany Four_WD) double engine_sizeL byte(turbo_premium full_manual full_airbags tire_pressure_monitor parking_aid transmission_warranty) int(person_1 person_2)
0 0 1 1   2 0 0 0 1 1  4  40  45
0 0 0 0 5.3 0 0 0 1 1  5  25  30
0 0 0 1   2 0 0 0 1 1  4  30 100
0 0 0 1   2 0 0 0 1 0  6  50 110
0 0 0 0 2.4 0 0 1 1 0  5 100  60
0 0 0 1 5.6 0 0 0 1 1  5  26  45
1 0 0 0   2 0 0 0 1 0 10  25  40
0 0 0 0 3.6 0 0 0 1 1  6  30  40
0 0 0 0 2.5 0 0 0 1 0  5  30  50
0 0 0 0 6.2 0 0 1 1 1  6  20  50
0 1 0 1   2 0 0 0 1 0  5 120  90
0 0 1 1   2 0 0 0 1 0  4  80  85
0 0 1 1   3 0 0 0 1 1  4 100  55
0 1 0 1   2 0 0 0 1 1  4 120  70
0 0 0 1 6.2 0 0 0 1 1  6  30  55
0 0 0 0 3.6 0 0 0 1 0  5  45  95
1 0 0 1   2 0 0 0 1 0 10  30  45
0 0 0 1 5.3 0 0 0 1 1  5  20  40
0 0 0 0 2.5 0 1 0 1 0  5  30  45
0 0 0 0 6.2 0 1 0 0 0  5  40  55
0 0 0 0 1.5 0 0 0 1 0 10  30  50
0 0 1 1   2 0 0 0 1 0  4  75  60
0 0 0 0 2.4 0 0 0 1 0  5  20  50
0 0 0 0 3.5 0 0 0 1 0  5  25  45
0 0 1 0   2 0 0 0 1 0  4  60  70
0 0 0 0   2 0 0 0 1 1  5  40  60
0 0 0 1 2.4 0 0 0 1 0  5  80  50
0 0 0 0 3.5 0 0 0 1 1  5  70  50
0 0 0 0 5.7 0 1 0 1 0  5  90  60
0 1 0 1   3 0 0 0 1 1  5 120  90
0 0 0 1 1.5 0 0 0 1 0  5  25  50
0 0 0 0 3.5 1 0 0 1 0  5  45  55
0 0 0 0 6.2 0 1 0 0 1  5  40  50
1 0 0 1   5 0 0 1 1 1 10  50  60
0 0 0 1   2 0 0 0 1 0  6  40  60
0 0 0 0 3.5 0 0 0 1 1  6  65  75
0 0 1 0   3 0 0 0 1 0  3  90  90
0 0 0 0 2.4 0 0 0 1 0  4  80  40
0 0 0 1   2 0 0 0 1 0  5 100  45
0 0 0 0 3.6 0 0 0 1 0  5  90  55
end
This is not the full variable list, since dataex limits the number of variables I can include, but this is my code for the joint hypothesis tests:

Code:
foreach y of varlist person_* {
    reg `y' korea1 korea2 uk1 uk2 germany1 germany2 Four_WD1 Four_WD2 engine_sizeL1 engine_sizeL2 turbo_premium1 turbo_premium2 full_manual1 full_manual2 full_airbags1 full_airbags2 tire_pressure_monitor1 tire_pressure_monitor2 parking_aid1 parking_aid2 transmission_warranty1 transmission_warranty2 g1 g2, robust
    test _b[korea1] = _b[korea2], notest
    test _b[uk1] = _b[uk2],accum notest
    test _b[germany1] = _b[germany2],accum notest
    test _b[Four_WD1] = _b[Four_WD2],accum notest
    test _b[engine_sizeL1] = _b[engine_sizeL2],accum notest
    test _b[turbo_premium1] = _b[turbo_premium2],accum notest
    test _b[full_manual1] = _b[full_manual2],accum notest
    test _b[full_airbags1] = _b[full_airbags2],accum notest
    test _b[tire_pressure_monitor1] = _b[tire_pressure_monitor2],accum notest
    test _b[parking_aid1] = _b[parking_aid2],accum notest
    test _b[transmission_warranty1] = _b[transmission_warranty2] ,accum notest
    test _b[g1] = _b[g2] ,accum
    
}
Here is a sample of the F-stat and p-values results:

Code:
F( 12,    56) =    1.64
Prob > F =    0.1080

I want to export the F-stats and p-values to an excel file.

Any help in this area would be appreciated. Thanks.
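One possible approach (a sketch, not from the original post; the file name, sheet layout, and abbreviated regressor list are assumptions) is to pick up the saved results r(F) and r(p) that test leaves behind and write them to Excel with putexcel:

Code:
putexcel set ftests.xlsx, replace
putexcel A1 = "depvar" B1 = "F" C1 = "p"
local row = 2
foreach y of varlist person_* {
    reg `y' korea1 korea2 uk1 uk2, robust    // abbreviated regressor list
    test _b[korea1] = _b[korea2], notest
    test _b[uk1] = _b[uk2], accum            // final accumulated test
    putexcel A`row' = "`y'" B`row' = r(F) C`row' = r(p)
    local ++row
}
After each (final, accumulated) test command, r(F) and r(p) hold the joint F-statistic and its p-value, so the putexcel line writes one row per dependent variable.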



Direct, indirect and total marginal effect in the mvprobit

How can I calculate the direct, indirect, and total marginal effects after mvprobit?

Predicted Probabilities with Logit and Fixed Effect

Hello, I am trying to compare 4 different groups (black women, white women, black men, and white men) and their probability of holding a leadership position. Leadership is a dichotomous DV coded (0,1) (oldleadership), and the fixed effects control for differences in state legislatures across states. The model also includes an interaction term. This is the code I have run and the output: xtlogit oldleader newideology female##race seniorityterms majoritymember, i(statefe) fe


[regression output attached as images]
Of course, if I run margins after this, the output would be nonsensical. I have no idea what else I should try. Can you not predict probabilities with panel data, a dichotomous dv, and a fixed effect? Thank you so much for your attention to this question.

Sincerely,
Jatia Wrighten

Measuring environmental innovation patents

Dear Stata experts,

For my master's thesis, I'm trying to work out how to measure environmental innovation patents.
I'm using IPC classes from the NBER dataset, but I'm struggling to prepare my data, so any help would be a blessing!

I want to keep a certain set of IPC classes, which I treat as environmental patents, and drop all other observations (other patent types / IPC classes).
I tried drop if, egen, and bysort ... egen, but sadly none of it worked. The tricky part is that the IPC classes are string variables. I tried destring, but because of non-numeric values (B01C is an example of a class) it fails, and the force option only generates missing values (dots), so I'm rather lost on how to proceed.

I need to keep only a few specified IPC classes so I can merge the prepared dataset with other datasets.

Thanks in advance and sorry if this is kind of a beginner/easy question.
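Since the IPC classes are strings, there is no need to destring at all; string comparisons work directly. A sketch (the variable name ipc_class and the listed classes are hypothetical):

Code:
* keep only the listed environmental IPC classes
keep if inlist(ipc_class, "B01C", "B01D", "F03D")

* or, to match on a leading pattern of the class code
keep if substr(ipc_class, 1, 3) == "B01"
inlist() accepts up to 10 string arguments; for longer lists, a helper dataset of wanted classes plus merge ... keep(match) scales better.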

Plotting regression results in scatter plot

Hello,

I have a regression of the form:

Code:
reg yvar xvar zvar i.year
where year is a categorical variable running from 1990 to 2020. I want to store the estimated coefficient for each year and plot the coefficients in a scatter plot with a fitted regression line; something along the lines of saving the estimates as "yearcoefficients" and then plotting
Code:
scatter year yearcoefficients || lfit year yearcoefficients
Is this possible at all?

Thank you very much!

Joan
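This is possible; one sketch (variable names are assumptions, and 1990 is the omitted base year, so its coefficient is normalized to 0) pulls each year coefficient out of _b after the regression and stores it in new variables for plotting:

Code:
reg yvar xvar zvar i.year

gen yearnum  = .
gen yearcoef = .
local row = 1
forvalues y = 1991/2020 {
    replace yearnum  = `y' in `row'
    replace yearcoef = _b[`y'.year] in `row'    // coefficient on that year dummy
    local ++row
}
scatter yearcoef yearnum || lfit yearcoef yearnum
The community-contributed coefplot (ssc install coefplot) offers a more polished alternative for plotting stored coefficients.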





margins & marginsplot in LCA by group

Good morning colleagues,

I'm having some trouble working out how to use margins and marginsplot to present bar charts for my Latent Class Analysis by Group.

My group LCA syntax is as follows:
Code:
gsem (Var1 Var2 Var3 Var4 Var5 Var6 <- _cons), family(bernoulli) link(logit) lclass(C 3) group(Var7) ginvariant(coef)
estat lcmean
estat lcprob
Previously, when I wasn't looking at this analysis by group, and just included Vars1-7 in a standard LCA, I used the below syntax to get the class probabilities charts and the individual class makeup, which was successful:

Code:
///Class probabilities chart
margins, predict(classpr class(1)) predict(classpr class(2))
marginsplot, recast(bar) xtitle("Latent Classes") ytitle("Probabilities of Belonging to a Class") xlabel(1 "Class 1" 2 "Class 2") title("Predicted Migraine Latent Class Probabilities with 95%CI")

///Class breakdown for class 1
margins, predict(outcome (Var1) class(1)) predict(outcome (Var2) class(1)) predict(outcome (Var3) class(1)) predict(outcome (Var4) class(1)) predict(outcome (Var5) class(1)) predict(outcome (Var6) class(1)) predict(outcome (Var7) class(1))
marginsplot, recast(bar) xtitle("") ytitle("") xlabel(1 "Var1" 2 "Var2" 3 "Var3" 4 "Var4" 5 "Var5" 6 "Var6" 7 "Var7") title ("Predicted Probability of Characteristics of Class 1")
However, I'm struggling to adapt the above code for margins and marginsplot to the group LCA, with a view to giving me the class probabilities for the classes within each binary group of Var7, and then the makeup of each of the classes within each group.

Any guidance would be greatly appreciated!

Kind regards,
Daniel Sullivan
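One direction worth trying (an untested sketch; whether margins handles the grouped gsem this way needs checking against the documentation) is to add the over() option for the grouping variable so the class probabilities are computed within each level of Var7:

Code:
margins, over(Var7) predict(classpr class(1)) predict(classpr class(2)) ///
         predict(classpr class(3))
marginsplot, recast(bar) by(Var7)
marginsplot's by() option then draws a separate panel of bars for each group.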

Create a variable that summarizes the number of distinct combinations of string variables

Hi all,

Apologies in advance if my problem is unclear. I'm a former Stata user who has since been using Excel and previously SAS so I'm trying to remember what I used to know in Stata. I'm trying to count the number of times a distinct combination appears for two string variables. What I would like to see is something like this where the total number of times a country appears across the combination of country and area:


Country Number of Actions
Afghanistan 2
Albania 1
Algeria 2
Angola 2
Armenia 3
Azerbaijan 1
Bangladesh 2
Belize 2
Benin 5
Bhutan 4
Bolivia 6
Bosnia and Herzegovina 1

This is what my data looks like:
Code:
* Example generated by -dataex-. To install: ssc install dataex
clear
input str47 country str19 area
"Afghanistan" "Social Programs"    
"Afghanistan" "Social Programs"    
"Albania"     "Enforcement"        
"Algeria"     "Enforcement"        
"Algeria"     "Legal Framework"    
"Angola"      "Enforcement"        
"Angola"      "Social Programs"    
"Armenia"     "Coordination"      
"Armenia"     "Social Programs"    
"Armenia"     "Social Programs"    
"Azerbaijan"  "Social Programs"    
"Bangladesh"  "Enforcement"        
"Bangladesh"  "Legal Framework"    
"Belize"      "Enforcement"        
"Belize"      "Legal Framework"    
"Benin"       "Enforcement"        
"Benin"       "Enforcement"        
"Benin"       "Government Policies"
"Benin"       "Social Programs"    
"Benin"       "Social Programs"    
end
I've tried egen distinct2 = rowvals(country area) after installing egenmore using ssc install egenmore, but I keep getting:

unknown egen function rowvals()
r(133);

Any help would be appreciated and please let me know if I can clarify!

Thanks!
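The desired table is a simple row count per country, which needs no egenmore at all; a sketch with built-in tools (distinguishing row counts from distinct country-area combinations, since the two differ when a combination repeats):

Code:
* number of rows (actions) per country
bysort country: gen n_actions = _N

* number of distinct country-area combinations per country
egen tagged = tag(country area)
bysort country: egen n_distinct = total(tagged)
To collapse to one row per country as in the desired table, contract country works, but it replaces the data in memory, so save first if the detail rows are still needed.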

Intersection among datasets

Hi all,

so I have k = 55 databases called name_country_mole.dta. Each database is an unbalanced panel of molecules observed for a certain number of quarters (the number of quarters, and the quarters themselves, may vary from one database to another). My objective is to find the molecules and quarters that match (i.e., the intersection) across those datasets. When I had only 3 or 4 countries, what I did was simply a series of reclink commands of the type:
Code:
reclink Molecule quarter using "france_mole.dta", gen(myscore) idm(id_mas) idu(id_us) minscore(1)
so country1 was reclinked with country2, producing country_1_2.dta; then country_1_2.dta was reclinked with country3 to obtain country_1_2_3.dta, and so on. Now that I have 55 databases, I have to find a smarter way to perform this task.
Can anyone please help me? I was thinking about something involving tempfiles but I am new to these.

Thank you,

Federico
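Since the poster mentions tempfiles: one sketch along those lines (the file pattern and key variables come from the post; an exact-match merge stands in for reclink's fuzzy matching) intersects the Molecule-quarter pairs iteratively, keeping only pairs present in every file:

Code:
local files : dir . files "*_mole.dta"
tempfile accum
local first 1
foreach f of local files {
    if `first' {
        use "`f'", clear
        keep Molecule quarter
        duplicates drop
        local first 0
    }
    else {
        save `accum', replace               // running intersection so far
        use "`f'", clear
        keep Molecule quarter
        duplicates drop
        merge 1:1 Molecule quarter using `accum', keep(match) nogenerate
    }
}
* the data in memory now hold the pairs common to all 55 files
If fuzzy matching is genuinely needed, the same accumulate-and-link structure applies with reclink in place of merge.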

Enquiry about using a likelihood ratio test with conditional logistic regression

Hello,
I am testing a categorical variable (ethnicity) against a binary outcome (death). I want to obtain a single p-value across all categories, so I employed a likelihood ratio test. But when I fit the null-hypothesis model, I don't get any odds ratios (the table is empty), although I do end up with a p-value.

I use the following commands:

clogit death i.ethnicity, group(matchedid) or
estimates store Danah
clogit death if ethnicity !=. , group(matchedid) or
lrtest Danah
it is after this command that I get no odds ratio:
clogit death if ethnicity !=. , group(matchedid) or
Nevertheless, I end up with a single P value after
lrtest Danah
When the same commands are run with logistic (unconditional logistic regression), I do get odds ratios when testing the null-hypothesis model, which made me suspect that I am doing something incorrectly.

Can anyone kindly advise on the correct way to run a likelihood ratio test for variables fitted with conditional logistic regression?

xtlogit with one dummy predictor

Hello all,

I have been looking for days for an answer to this question, but have not found an answer so far.

My problem is as follows: I am working on a project where I am looking to see if sexual minorities (gay men/lesbian women) are more likely to enter male dominated industries. I have computed dummy variables (1/0) for the sexual minorities, and created dummies for male dominated industries (i.e. a dummy for industries that are 0-20% male, a dummy for industries that are 20-40% male etc.). I have an (unbalanced) panel data set, so I am using the xtlogit command (with odds ratio) to see whether sexual minorities are more likely to enter certain industries. However, when I use the command xtlogit (with 0-20% male dominated as dependent variable) and sexual minority as independent variable, the odds ratio I get is 2.82e-07 - which seems unlikely, as a substantial amount of sexual minorities are working in this industry.

What am I missing?

Kind Regards,

Bob Schuitemaker

Forecasting

I have been using Stata to do some forecasting of stock market data. When I generate the forecasts and plot them, they are miles off, even though the in-sample predictions are quite close to the real data.

Here is a small bit of the code:

arch dSP, ar(1/3) ma(2/3) arch(1/3) garch(1/3) tarch(1)
local T = _N                      // last in-sample observation
set obs `=_N+2'                   // add two out-of-sample periods
replace t = _n
predict pSP, dynamic(`=`T'+1')    // dynamic forecast from the first new period

(The original dynamic(`N+2') referenced an undefined local N, so the dynamic switch point was never set; the version above fixes that.)

Lots of the models I have used behave similarly. Does anyone know whether this is a viable way to make predictions, or am I missing something?

Thanks

Tesla McKinsey 7S Model

The Tesla McKinsey 7S model illustrates the ways in which seven elements of a business can be aligned to increase effectiveness. According to the framework, strategy, structure and systems are the hard elements, whereas shared values, skills, style and staff are the soft elements. The McKinsey 7S model stresses the strong links between elements: a change in one element causes changes in the others. Shared values are positioned at the core of the Tesla McKinsey 7S model, since shared values guide employee behaviour, with implications for performance. (Figure 11: McKinsey 7S model)

Hard Elements

Strategy: Tesla's business strategy is built on electric cars, driven by the company's mission to accelerate the world's transition to sustainable energy. The alternative fuel vehicles manufacturer pursues a product differentiation strategy: Tesla cars and energy products are differentiated on the basis of performance, design and environmental sustainability. Ownership of distribution, via company-operated stores and galleries in shopping centres and other locations, is also central to the strategy, and the company positions the low cost of Tesla electric vehicle ownership as one of the solid bases of its competitive advantage.

Structure: It is difficult to place Tesla's organizational structure in a single category because of its unique nature. The inability or unwillingness of CEO Elon Musk to delegate key tasks has implications for the company's structure: Musk has more people reporting directly to him than at any other auto company, turnover amongst Tesla's senior executive team is high, and there is no organizational chart or public list of senior leaders at Tesla.
Nevertheless, Tesla's organizational structure is closer to a divisional structure than to the other well-known structures. The operations of the electric automaker are divided…

No observations r(2000)

Dear everyone,

I am working on my master's thesis and I am trying to run a two-step system GMM regression on GDP with five-year averages. I calculated the averages per five years in Excel; the average for 1995-1999 is called 1995, so I have 1995, 2000, 2005, 2010 and 2015.
I specify my data as xtset CountryID Year, where delta is 1. When I try to run my regression, Stata tells me that there are no observations, and I suspect it is because four years are missing within each five-year average.

Does anyone know how I can solve this problem? Perhaps I need to change delta to 5, but I do not know how to do so.

Any help is greatly appreciated!
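If successive observations are five years apart, telling xtset so resolves the apparent gaps; a minimal sketch:

Code:
xtset CountryID Year, delta(5)
With delta(5), lags and differences refer to the previous five-year observation (e.g. L.GDP for 1995 picks up 1990 rather than a nonexistent 1994), which is what the GMM instruments need.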

Generalise value of dummy variable to all observations of a given panel ID

Hi,

I need to set the value of a dummy variable = 1 for all observations of a given panelID, if it already equals 1 for any observations of that panelID

Considering the data example below, if for a given value of panelID the variable Rev==1 at any observation, I need to set Rev=1 for all observations with that panelID.

In this example, the result should be that the value of Rev == 1 for all observations where panelID==3 (i.e. years 1989 to 1994), while the value of Rev should remain 0 for all observations with panelIDs == 1 and 2.

Any help is greatly appreciated.

John

Code:
* Example generated by -dataex-. To install: ssc install dataex
clear
input long panelID float(DataYear Rev)
1 1989 0
1 1990 0
1 1991 0
1 1992 0
1 1993 0
1 1994 0
2 1989 0
2 1990 0
2 1991 0
2 1992 0
2 1993 0
2 1994 0
3 1989 0
3 1990 0
3 1991 0
3 1992 0
3 1993 0
3 1994 1
end
format %ty DataYear
label values panelID firmID
label def firmID 1 "001004", modify
label def firmID 2 "001009", modify
label def firmID 3 "001011", modify
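A sketch using egen's group-wise maximum: since Rev is 0/1, its maximum within a panel is 1 exactly when any observation in that panel is 1.

Code:
bysort panelID: egen byte Rev_any = max(Rev)
replace Rev = Rev_any    // overwrite Rev with the panel-wide indicator
drop Rev_any
On the example data, this sets Rev to 1 for all six observations of panelID 3 and leaves panels 1 and 2 at 0.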

Creating a categorical variable with multiple dummy variables

Hello everyone,

I am looking for a way to create one categorical variable out of multiple dummy variables. More specifically, I have ten dummy variables indicating whether a person remembered a certain word named co07_1 co07_2 co07_3 co07_4 co07_5 co07_6 co07_7 co07_8 co07_9 co07_10. They are =1 if the respondent remembered the word and =3 otherwise.

I want to create a categorical variable that tells me how many words a person remembered ranging from 0 to 10.

If I were to do this by hand, it would require a lot of coding. Here is an example of what this would look like for the values 9 and 10.

gen co07count = 10 if co07_1==1 & co07_2==1 & co07_3==1 & co07_4==1 & co07_5==1 & co07_6==1 & co07_7==1 & co07_8==1 & co07_9==1 & co07_10==1
replace co07count = 9 if co07_1==3 & co07_2==1 & co07_3==1 & co07_4==1 & co07_5==1 & co07_6==1 & co07_7==1 & co07_8==1 & co07_9==1 & co07_10==1 | co07_1==1 & co07_2==3 & co07_3==1 & co07_4==1 & co07_5==1 & co07_6==1 & co07_7==1 & co07_8==1 & co07_9==1 & co07_10==1 | co07_1==1 & co07_2==1 & co07_3==3 & co07_4==1 & co07_5==1 & co07_6==1 & co07_7==1 & co07_8==1 & co07_9==1 & co07_10==1 | co07_1==1 & co07_2==1 & co07_3==1 & co07_4==3 & co07_5==1 & co07_6==1 & co07_7==1 & co07_8==1 & co07_9==1 & co07_10==1 | co07_1==1 & co07_2==1 & co07_3==1 & co07_4==1 & co07_5==3 & co07_6==1 & co07_7==1 & co07_8==1 & co07_9==1 & co07_10==1 | co07_1==1 & co07_2==1 & co07_3==1 & co07_4==1 & co07_5==1 & co07_6==3 & co07_7==1 & co07_8==1 & co07_9==1 & co07_10==1 | co07_1==1 & co07_2==1 & co07_3==1 & co07_4==1 & co07_5==1 & co07_6==1 & co07_7==3 & co07_8==1 & co07_9==1 & co07_10==1 | co07_1==1 & co07_2==1 & co07_3==1 & co07_4==1 & co07_5==1 & co07_6==1 & co07_7==1 & co07_8==3 & co07_9==1 & co07_10==1 co07_1==1 & co07_2==1 & co07_3==1 & co07_4==1 & co07_5==1 & co07_6==1 & co07_7==1 & co07_8==1 & co07_9==3 & co07_10==1 | co07_1==1 & co07_2==1 & co07_3==1 & co07_4==1 & co07_5==1 & co07_6==1 & co07_7==1 & co07_8==1 & co07_9==1 & co07_10==3

Is there any way that one can use a loop for this that would (i) speed up the process and (ii) reduce the possibility that any errors occur?

Somebody's help with this would be very much appreciated!
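No loop is even needed: egen's anycount() counts, for each observation, how many variables in a list equal the given values, which here means how many of the ten words were remembered (coded 1):

Code:
egen co07count = anycount(co07_1-co07_10), values(1)

* equivalent loop version, if a loop is preferred
gen co07count2 = 0
forvalues i = 1/10 {
    replace co07count2 = co07count2 + (co07_`i' == 1)
}
Both yield a 0-10 count and avoid the error-prone hand-written conditions entirely.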

Help needed in loops

Hi all,

I am using Stata 17 and would need some help in my loops. Below is an example of my dataset:

Code:
* Example generated by -dataex-. For more info, type help dataex
clear
input byte(korea uk germany Four_WD) double engine_sizeL byte(turbo_premium full_manual full_airbags tire_pressure_monitor parking_aid transmission_warranty) int(person_1 person_2)
0 0 0 0 3.5 0 0 0 1 0  6 .  30
0 0 1 1   3 0 0 0 1 1  4 .  45
0 0 0 1 3.5 0 0 0 1 1  5 .  40
0 1 0 1   2 0 0 0 1 0  4 .  60
0 0 0 0 2.7 1 0 0 1 0  5 .  50
1 0 0 0 1.6 0 0 0 1 0 10 .  50
0 0 0 1 4.6 0 0 1 1 0  6 .  90
0 0 0 0 3.6 0 0 1 1 1  6 .  50
0 0 0 1   2 0 0 0 1 0  5 .  45
0 0 0 1 3.6 0 1 0 1 0  5 .  50
0 1 0 1   2 0 0 0 1 0  5 . 100
0 0 0 1 5.7 0 0 0 1 1  5 .  40
0 0 0 1 3.5 1 0 0 1 1  5 .  40
0 0 0 0 6.2 0 1 0 0 1  5 .  45
0 0 0 1 2.4 0 0 0 1 0  4 .  40
1 0 0 0 1.6 0 0 0 1 0 10 .  40
0 0 0 1 5.7 0 0 1 1 1  6 .  90
0 0 1 1 2.9 1 0 1 1 1  4 .  90
0 0 1 1   4 1 0 1 1 1  4 .  60
0 0 0 0 2.5 0 0 0 1 0  5 .  55
0 0 0 0 2.4 0 0 0 1 0  5 .  45
0 0 0 1   2 0 0 0 1 0 10 .  40
0 0 1 1   3 1 0 0 1 0  4 .  70
0 0 0 1 3.5 1 0 0 1 0  5 .  55
0 0 0 0 1.4 0 1 0 1 1  4 .  40
0 0 1 1 3.6 0 0 1 1 1  4 .  75
0 0 0 0   2 0 0 0 1 1  6 .  55
0 1 0 1   3 0 0 0 1 1  5 .  80
1 0 0 1 2.4 0 0 0 1 0 10 .  60
0 0 1 1   4 1 0 1 1 1  4 .  50
0 0 0 1 2.4 0 1 0 1 0  5 .  50
1 0 0 0 3.3 1 0 1 1 1 10 .  75
0 0 0 0 1.6 0 1 0 1 0  5 .  45
0 0 0 0   2 0 0 0 1 0  6 .  65
1 0 0 0   2 0 0 0 1 1 10 .  40
0 0 0 1 4.2 0 0 0 0 1  6 .  65
1 0 0 0 3.3 0 0 0 1 0 10 .  45
0 1 0 1   3 0 0 0 1 1  5 .  85
0 0 0 1 2.5 0 0 0 1 1  5 .  50
0 0 0 1 3.5 1 0 0 1 0  5 .  50
end

I want to run a loop running regressions using this code:

Code:
foreach y of varlist person_* {
    reg `y' korea uk germany engine_sizeL turbo_premium full_manual Four_WD full_airbags tire_pressure_monitor parking_aid transmission_warranty  if  auto_id>40, robust
}
But as you can see, person_1 has no values at all. How do I make the loop skip variables that are entirely missing? I get a "no observations" error whenever I run the loop. What I want is for the loop to skip such variables and continue running on the others.

Any help in this area would be appreciated. Thanks!
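One sketch (keeping the regression from the post, including its auto_id restriction, which is not in the dataex excerpt) counts non-missing values first and skips empty variables with continue:

Code:
foreach y of varlist person_* {
    quietly count if !missing(`y') & auto_id > 40
    if r(N) == 0 {
        display as text "`y': no observations, skipped"
        continue                              // move on to the next variable
    }
    reg `y' korea uk germany engine_sizeL turbo_premium full_manual Four_WD ///
        full_airbags tire_pressure_monitor parking_aid transmission_warranty ///
        if auto_id > 40, robust
}
An alternative is to wrap the regression in capture and inspect _rc (error code 2000 is "no observations"), but the explicit count makes the intent clearer.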