Hi
I have a very large daily dataset: 1 million rows and 50 columns. However, I anticipate that the number of columns will increase to approximately 5,000 once I add daily dummy variables, weekly dummy variables, and weekly-by-city dummy variables.
I would like to use these data to run a fixed-effects analysis.
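To make the setup concrete, here is a minimal sketch of the specification I have in mind, assuming hypothetical variable names y, x, day, week, and city; the factor-variable terms are what generate the dummy variables described above:

* Hypothetical variables: y (outcome), x (regressor of interest),
* plus day, week, and city identifiers.
* i.day, i.week, and i.week#i.city expand into the daily, weekly,
* and weekly-by-city dummy variables at estimation time.
regress y x i.day i.week i.week#i.city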
My computer has a 4-core CPU and 8 GB of RAM.
In two trial runs with Stata 13 SE, the computer froze for 8 hours and 10 hours, respectively.
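My rough back-of-envelope calculation (my own assumption that every column is stored as a 4-byte float; byte-type dummies would take less, and the estimation itself allocates more on top) suggests the widened dataset alone would not fit in 8 GB of RAM:

* Approximate size in GB of 1,000,000 rows x 5,000 float variables:
* about 18.6 GB, well above the 8 GB of installed RAM.
display 1000000 * 5000 * 4 / 1024^3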
I do not know what to do, since this is the first time I have had to work with data this large. Is there a good way to handle it?
I am willing to add RAM if necessary.
Thanks for reading, and thanks for any advice.