Hi
I have a very large daily dataset: about 1 million rows and 50 columns.
However, I expect the number of columns to grow to roughly 5,000 once I add daily dummy variables, weekly dummy variables, and week x city dummy variables.
I would like to use these data to estimate a fixed effects model.
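For reference, this is roughly the specification I have in mind. It is only a sketch; y, x1, x2, city, and date are placeholder names, not my actual variables.

Code:
* Sketch only; y, x1, x2, city, and date are placeholder variable names
gen week = wofd(date)                  // weekly identifier from the daily date variable
* daily, weekly, and week x city dummies via factor-variable notation
regress y x1 x2 i.date i.week i.week#i.city

It is these dummy sets that push the effective number of columns toward 5,000.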
My computer has a 4-core CPU and 8 GB of RAM.
In two attempts with Stata 13 SE, the computer hung for 8 hours and 10 hours, respectively, without finishing.
I do not know what to do, since this is the first time I have had to work with data this large.
Is there a good way to handle this?
I am willing to add RAM if necessary.
Thanks for reading, and thanks for any advice.