WEBVTT
00:00:00.000 --> 00:00:05.000
- Everyone out there in the live stream, welcome, welcome.
00:00:05.000 --> 00:00:07.000
Thank you for joining us.
00:00:07.000 --> 00:00:08.420
It's really great to have you here.
00:00:08.420 --> 00:00:11.120
So if you've got thoughts, comments you'd like to add
00:00:11.120 --> 00:00:13.600
to the show, please put them into the YouTube live chat
00:00:13.600 --> 00:00:18.480
and we'll see what we can do to make that part of the show.
00:00:18.480 --> 00:00:20.800
All right, Julian, you ready to kick this off?
00:00:20.800 --> 00:00:21.900
- Yeah, sure.
00:00:21.900 --> 00:00:26.240
- Julian, welcome to Talk Python to Me.
00:00:26.240 --> 00:00:28.640
- Thank you.
00:00:28.640 --> 00:00:30.640
- Yeah, it's great to have you here.
00:00:30.640 --> 00:00:34.560
And I think we've got a bunch of fun stuff to talk about.
00:00:34.560 --> 00:00:37.220
It's really interesting to think about
00:00:37.220 --> 00:00:39.880
how we go about building software at scale.
00:00:39.880 --> 00:00:41.260
And one of the things that just,
00:00:41.260 --> 00:00:43.140
I don't know how you feel about it, reading your book,
00:00:43.140 --> 00:00:46.340
I feel like you must have some opinions on this.
00:00:46.340 --> 00:00:47.660
But when I go to a website
00:00:47.660 --> 00:00:50.420
that is clearly not a small little company,
00:00:50.420 --> 00:00:53.620
it's obviously a large company with money to put behind,
00:00:53.620 --> 00:00:55.720
you know, professional developers and stuff.
00:00:55.720 --> 00:00:58.420
And you click on it and it takes four seconds
00:00:58.420 --> 00:00:59.900
for every page load.
00:00:59.900 --> 00:01:01.620
It's just like, how is it possible
00:01:01.620 --> 00:01:04.300
that you're building this software with so much,
00:01:04.300 --> 00:01:06.540
oh, this is the face of your business.
00:01:06.540 --> 00:01:08.140
And sometimes they decide to fix it
00:01:08.140 --> 00:01:09.140
with front-end frameworks.
00:01:09.140 --> 00:01:11.260
So then you get like a quick splash
00:01:11.260 --> 00:01:13.580
of like a box with a little UI,
00:01:13.580 --> 00:01:15.420
and then it's just loading for four seconds,
00:01:15.420 --> 00:01:17.300
which to me feels no better.
00:01:17.300 --> 00:01:20.460
So I don't know, I feel like building scalable software
00:01:20.460 --> 00:01:22.220
is really important, and still people are getting it
00:01:22.220 --> 00:01:23.500
quite wrong quite often.
00:01:23.500 --> 00:01:27.700
- Yeah, I mean, I think it's also,
00:01:27.700 --> 00:01:30.880
Because there are a lot of things you want to do when you do that,
00:01:30.880 --> 00:01:34.400
which is like, well, I mean, write proper code for sure,
00:01:34.400 --> 00:01:37.280
but also you want to be able to measure everything,
00:01:37.280 --> 00:01:40.800
like to understand where the bottleneck might be.
00:01:40.800 --> 00:01:45.340
And that's not the easy part, like writing code and fixing bugs and stuff.
00:01:45.340 --> 00:01:49.440
We all know how to do that. But then if we are asked to optimize,
00:01:49.440 --> 00:01:53.200
that's one of the things that I usually use as an example
00:01:53.200 --> 00:01:56.940
when I talk about profiling is like, well, if I were to ask you,
00:01:56.940 --> 00:01:59.520
I want you to tell me which part of your code
00:01:59.520 --> 00:02:02.260
is using 20% of the CPU.
00:02:02.260 --> 00:02:03.100
You really don't know.
00:02:03.100 --> 00:02:04.460
You can guess.
00:02:04.460 --> 00:02:07.320
You can probably do a good guess most of the time,
00:02:07.320 --> 00:02:08.900
but for real, you don't know.
00:02:08.900 --> 00:02:11.660
You have no clue until you actually look at the data,
00:02:11.660 --> 00:02:14.860
use a profiler or any tool for that matter
00:02:14.860 --> 00:02:16.660
that will give you this information.
00:02:16.660 --> 00:02:21.180
- Yeah, we're really bad at using our intuition
00:02:21.180 --> 00:02:22.060
for those things.
00:02:22.060 --> 00:02:24.380
I remember the most extreme example I ever had of this
00:02:24.380 --> 00:02:25.540
was I was working on this project
00:02:25.540 --> 00:02:29.320
that was doing huge amounts of math, wavelet decomposition,
00:02:29.320 --> 00:02:32.900
kind of like Fourier analysis, but I think kind of worse.
00:02:32.900 --> 00:02:34.920
And I thought, okay, this is too slow.
00:02:34.920 --> 00:02:37.860
It must be in all this complicated math area.
00:02:37.860 --> 00:02:39.500
And I don't understand the math very well,
00:02:39.500 --> 00:02:40.480
and I don't want to change it,
00:02:40.480 --> 00:02:42.180
but this is gotta be here, right?
00:02:42.180 --> 00:02:43.120
It's slow.
00:02:43.120 --> 00:02:44.780
And I put it into the profiler.
00:02:44.780 --> 00:02:47.060
And the thing that turned out was
00:02:47.060 --> 00:02:48.740
we were spending 80% of our time
00:02:48.740 --> 00:02:53.180
just doing like finding the index of an element in a list.
00:02:53.180 --> 00:02:54.300
- Yeah, which is--
00:02:54.300 --> 00:02:55.460
- It's insane.
00:02:55.460 --> 00:02:59.140
- Yeah, my favorite, I think, programming quote
00:02:59.140 --> 00:03:01.060
is from Donald Knuth, which is,
00:03:01.060 --> 00:03:04.220
"Premature optimization is the root of all evil."
00:03:04.220 --> 00:03:07.300
Like, it's widely known, but I mean,
00:03:07.300 --> 00:03:11.560
I think I will quote it every week or so now.
00:03:11.560 --> 00:03:13.500
- Yeah, it's fantastic, it's fantastic.
00:03:13.500 --> 00:03:15.780
Yeah, in my case, we switched it to a dictionary
00:03:15.780 --> 00:03:17.820
and it went five times faster, and that was it.
00:03:17.820 --> 00:03:19.740
Like, it was incredibly easy to fix,
00:03:19.740 --> 00:03:22.280
but understanding that that was where the problem was,
00:03:22.280 --> 00:03:23.940
I would have never guessed.
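The list-versus-dictionary fix described here can be sketched like this (an illustrative reconstruction with made-up sizes, not the actual project code from the story):

```python
import timeit

# Hypothetical sizes chosen for illustration.
items = list(range(10_000))

# Precompute a dict mapping each value to its position.
index_of = {value: i for i, value in enumerate(items)}

# list.index scans the list from the front: O(n) per lookup.
list_time = timeit.timeit(lambda: items.index(9_999), number=1_000)

# A dict lookup is a hash-table probe: O(1) on average.
dict_time = timeit.timeit(lambda: index_of[9_999], number=1_000)

print(f"list.index: {list_time:.4f}s, dict: {dict_time:.4f}s")
```

The speedup grows with the list size, since the linear scan gets slower while the hash lookup stays roughly constant.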
00:03:23.940 --> 00:03:25.900
So yeah, it's hard to understand
00:03:25.900 --> 00:03:28.540
and we're gonna talk about finding these challenges
00:03:28.540 --> 00:03:30.620
and also some of the design patterns.
00:03:30.620 --> 00:03:32.380
You've written a really cool book
00:03:32.380 --> 00:03:34.740
called "The Hacker's Guide to Scaling Python"
00:03:34.740 --> 00:03:36.580
and we're gonna dive into some of the ideas you cover there.
00:03:36.580 --> 00:03:39.460
Also talk about some of your work at Datadog
00:03:39.460 --> 00:03:41.620
where you're doing some of the profiling stuff,
00:03:41.620 --> 00:03:43.580
not necessarily for you internally,
00:03:43.580 --> 00:03:45.280
although I'm sure there is some,
00:03:45.280 --> 00:03:47.540
but it also could be for so many people
00:03:47.540 --> 00:03:50.300
like you guys basically have profiling as a service
00:03:50.300 --> 00:03:52.660
and runtime as a service,
00:03:52.660 --> 00:03:54.500
runtime analysis as a service, which is great,
00:03:54.500 --> 00:03:55.340
and we'll get into that.
00:03:55.340 --> 00:03:57.140
But before we do, let's start with your story.
00:03:57.140 --> 00:03:59.140
How'd you get into programming in Python?
00:03:59.140 --> 00:04:00.980
- That was a good question.
00:04:00.980 --> 00:04:04.740
So actually, I think I started like 15 years ago or so.
00:04:04.740 --> 00:04:07.340
I actually started to learn Perl,
00:04:07.340 --> 00:04:08.860
as my first programming language,
00:04:08.860 --> 00:04:10.940
like, you know, kind of scripting language,
00:04:10.940 --> 00:04:13.420
like we used to call them at least a few years ago.
00:04:13.420 --> 00:04:16.700
And yeah, I mean, I like Perl,
00:04:16.700 --> 00:04:20.060
but I wanted to learn like object-oriented programming,
00:04:20.060 --> 00:04:23.420
and I never understood Perl object-oriented programming.
00:04:23.420 --> 00:04:26.540
Their model was so weird for me.
00:04:26.540 --> 00:04:28.860
Maybe because I was young and I don't know.
00:04:28.860 --> 00:04:32.700
But somebody talked to me about Python.
00:04:32.700 --> 00:04:38.460
I bought the book, the O'Reilly book about Python.
00:04:38.460 --> 00:04:41.900
And I kept it around for a year or so because I had no project at all,
00:04:41.900 --> 00:04:45.340
no idea. Most of my job back then was to be a sysadmin,
00:04:45.340 --> 00:04:47.820
so not really anything to do with Python.
00:04:47.820 --> 00:04:51.500
And some day I was working on Debian,
00:04:51.500 --> 00:04:53.620
like the Linux distribution,
00:04:53.620 --> 00:04:56.540
and I was like, oh, I need to do something like a new project
00:04:56.540 --> 00:04:58.540
and I'm going to do that with Python.
00:04:58.540 --> 00:05:00.700
And I started to learn Python this way
00:05:00.700 --> 00:05:03.900
with my project on one side, the book on the other side.
00:05:03.900 --> 00:05:05.940
I was like, that's amazing, I love it.
00:05:05.940 --> 00:05:08.380
And I never stopped doing Python after that.
00:05:08.380 --> 00:05:09.660
- That's fantastic.
00:05:09.660 --> 00:05:12.020
It feels like it very much was an
00:05:12.020 --> 00:05:14.900
automate the boring stuff type of introduction.
00:05:14.900 --> 00:05:16.100
Like there's these little problems
00:05:16.100 --> 00:05:20.100
and Bash is too small or too hard to make it solve those problems.
00:05:20.100 --> 00:05:24.100
What else could I use? And Python was a good fit for that.
00:05:24.100 --> 00:05:28.100
- Yeah, it's a great way. I have a lot of people coming to me over the years
00:05:28.100 --> 00:05:32.100
and being like, "Julien, I want to contribute to a project.
00:05:32.100 --> 00:05:36.100
I want to start something in Python. What should I do?" I don't know. What's your problem you want to solve?
00:05:36.100 --> 00:05:40.100
If you want to find a boring thing you want to automate or anything that's the best
00:05:40.100 --> 00:05:44.100
idea you can have to... If it's an open source project that exists already,
00:05:44.100 --> 00:05:45.940
I mean, good for you, it's even better,
00:05:45.940 --> 00:05:49.220
but I mean, just write a script or whatever you want to,
00:05:49.220 --> 00:05:51.140
to start hacking and learning.
00:05:51.140 --> 00:05:54.700
That's the best way to scratch your own itch.
00:05:54.700 --> 00:05:56.340
- Yeah, absolutely.
00:05:56.340 --> 00:05:57.460
It's so easy to think of,
00:05:57.460 --> 00:05:58.800
well, I want to build this great big thing,
00:05:58.800 --> 00:06:01.260
but we all have these little problems that need solving
00:06:01.260 --> 00:06:04.180
and it's good to start small and practice small
00:06:04.180 --> 00:06:07.140
and build up and yeah, I find it really valuable.
00:06:07.140 --> 00:06:09.040
People often ask me like, oh, I want to get started.
00:06:09.040 --> 00:06:09.880
What should I do?
00:06:09.880 --> 00:06:11.140
Should I build a website like this?
00:06:11.140 --> 00:06:13.420
Maybe a machine learning thing like that.
00:06:13.420 --> 00:06:14.260
I'm like, whoa, whoa, whoa.
00:06:14.260 --> 00:06:16.460
Like that's, yes, you definitely want to get there.
00:06:16.460 --> 00:06:18.340
But you know, if you're really, really just starting,
00:06:18.340 --> 00:06:20.100
like, you know, don't kill yourself
00:06:20.100 --> 00:06:21.540
by trying to take on too much at once.
00:06:21.540 --> 00:06:24.260
So it sounds like it worked well for you.
00:06:24.260 --> 00:06:25.100
How about now?
00:06:25.100 --> 00:06:25.920
What are you doing day to day?
00:06:25.920 --> 00:06:26.860
I hinted at Datadog.
00:06:26.860 --> 00:06:30.780
- Yeah, so I've been doing Python for,
00:06:30.780 --> 00:06:32.340
I mean, the next 10 years,
00:06:32.340 --> 00:06:35.500
after I learned Python, I've been working on OpenStack,
00:06:35.500 --> 00:06:37.660
which is a huge Python project,
00:06:37.660 --> 00:06:41.220
implementing an open cloud system
00:06:41.220 --> 00:06:44.220
where you can host your own AWS, basically.
00:06:44.220 --> 00:06:46.220
And so everything is in Python there.
00:06:46.220 --> 00:06:52.220
So I worked on one of the largest Python projects,
00:06:52.220 --> 00:06:54.220
which is OpenStack, for a few years.
00:06:54.220 --> 00:06:57.220
And then I decided to go for a change.
00:06:57.220 --> 00:07:01.220
And then I was looking into building a profiling team,
00:07:01.220 --> 00:07:04.220
building a profiler, a continuous profiler,
00:07:04.220 --> 00:07:08.220
which means you would not profile your script on your laptop,
00:07:08.220 --> 00:07:11.660
but profile your application running on your production system for real.
00:07:11.660 --> 00:07:17.180
And I was like, "That's not something I think anyone did before in Python, at least.
00:07:17.180 --> 00:07:20.540
So I want to do that." And that's what I started to do two years ago.
00:07:20.540 --> 00:07:22.860
That's really interesting because normally,
00:07:22.860 --> 00:07:27.580
you have this quantum mechanics problem with profilers and debuggers,
00:07:27.580 --> 00:07:32.180
especially profilers like the line-by-line ones so much
00:07:32.180 --> 00:07:34.460
where it runs at one speed normally,
00:07:34.460 --> 00:07:37.060
and then you hit it with CProfile or something,
00:07:37.060 --> 00:07:39.560
and it's five times slower or whatever it turns out to be.
00:07:39.560 --> 00:07:43.680
And you're like, whoa, this is a lot slower.
00:07:43.680 --> 00:07:48.680
Hopefully it gives you just a factor of slowness over it.
00:07:48.680 --> 00:07:53.040
Like if it says it spends 20% here and 40% there,
00:07:53.040 --> 00:07:55.720
hopefully it's still true at normal speed,
00:07:55.720 --> 00:07:58.080
but sometimes it really depends, right?
00:07:58.080 --> 00:08:00.060
Like if you're calling a function
00:08:00.060 --> 00:08:02.240
that goes out of your system and that's 20%,
00:08:02.240 --> 00:08:04.880
and then you're doing a really tight loop with lots of code,
00:08:04.880 --> 00:08:07.440
the profiler will introduce more overhead
00:08:07.440 --> 00:08:10.840
in your tight loop part than it will in the external system
00:08:10.840 --> 00:08:13.120
where it adds basically zero overhead.
00:08:13.120 --> 00:08:17.640
And so that's a big challenge of understanding profiling
00:08:17.640 --> 00:08:21.240
results in general, and it's a really big reason
00:08:21.240 --> 00:08:24.300
to not just run the profiler constantly in production.
00:08:24.300 --> 00:08:26.440
- Yeah, exactly.
00:08:26.440 --> 00:08:30.800
It's, I mean, and people do that now.
00:08:30.800 --> 00:08:33.120
I mean, if you have the right profiler,
00:08:33.120 --> 00:08:36.400
The way CProfile works, and we can dig a bit into that,
00:08:36.400 --> 00:08:39.200
but CProfile, the way it works, it's going to intercept everything.
00:08:39.200 --> 00:08:41.200
It's what we call a deterministic profiler,
00:08:41.200 --> 00:08:44.000
where if you run the same program twice,
00:08:44.000 --> 00:08:46.400
you will get the same CProfile output for sure.
00:08:46.400 --> 00:08:49.600
It's intercepting all the function calls that you have.
00:08:49.600 --> 00:08:51.680
So if you have a ton of function calls,
00:08:51.680 --> 00:08:53.840
it makes things, like you were saying,
00:08:53.840 --> 00:08:56.000
five times slower for sure at least.
00:08:56.000 --> 00:08:58.880
And it'll inject a little bit of bytecode
00:08:58.880 --> 00:09:00.640
at the beginning and end of every function,
00:09:00.640 --> 00:09:02.880
all sorts of stuff, and it actually changes what happens.
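As a rough sketch of the intercept-everything style being described, this is how cProfile is typically driven from code (a minimal example, not anything from the episode; `busy` is a made-up workload):

```python
import cProfile
import io
import pstats

def busy():
    # A deliberately naive workload so the profiler has something to report.
    return sum(i * i for i in range(100_000))

profiler = cProfile.Profile()
profiler.enable()   # from here on, every function call is intercepted
busy()
profiler.disable()

# Render the deterministic call counts and timings, sorted by cumulative time.
stream = io.StringIO()
pstats.Stats(profiler, stream=stream).sort_stats("cumulative").print_stats(5)
print(stream.getvalue())
```

Because every call is hooked, the counts are exact and repeatable, which is exactly why the overhead scales with how many function calls your code makes.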
00:09:02.880 --> 00:09:07.880
Yeah, exactly. So it can change the timing.
00:09:07.880 --> 00:09:10.880
So it's a good solution to have a
00:09:10.880 --> 00:09:14.880
ballpark estimate of what's going on
00:09:14.880 --> 00:09:16.880
and it gives you pretty good results.
00:09:16.880 --> 00:09:17.880
Usually it's a good tool. I used it a lot of times over the years.
00:09:17.880 --> 00:09:22.880
It always gave me good information.
00:09:22.880 --> 00:09:25.880
The problem with CProfile is that you can't use it in production
00:09:25.880 --> 00:09:27.880
because it's too slow.
00:09:27.880 --> 00:09:28.880
It's also not providing enough information.
00:09:28.880 --> 00:09:31.880
It gives you the wall time that you use,
00:09:28.880 --> 00:09:31.880
but not necessarily the CPU time of each of your threads,
00:09:31.880 --> 00:09:33.880
et cetera, that you're going to use.
00:09:33.880 --> 00:09:35.880
So the information is not really fine-grained.
00:09:35.880 --> 00:09:37.880
It's a rough--
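The wall-time versus CPU-time distinction mentioned here can be demonstrated with the standard library's two clocks (a minimal illustration):

```python
import time

wall_start = time.perf_counter()   # wall-clock time
cpu_start = time.process_time()    # CPU time of this process only

time.sleep(0.2)                             # waiting: wall time passes, CPU is idle
total = sum(i * i for i in range(200_000))  # computing: both clocks advance

wall = time.perf_counter() - wall_start
cpu = time.process_time() - cpu_start
print(f"wall: {wall:.3f}s, cpu: {cpu:.3f}s")
```

A profiler reporting only wall time can't tell you whether those 0.2 seconds were spent burning CPU or just waiting on I/O; per-thread CPU time (`time.thread_time` in the stdlib) is what the finer-grained tools expose.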
00:09:37.880 --> 00:09:39.880
- Yeah, it's probably not streaming either, right?
00:09:39.880 --> 00:09:41.880
It runs and then gives you the answer,
00:09:41.880 --> 00:09:45.880
not some sort of real-time stream of what's happening.
00:09:45.880 --> 00:09:48.880
- So for one of the cases, like we were mentioning previously,
00:09:48.880 --> 00:09:51.880
where you know this part of the code,
00:09:51.880 --> 00:09:53.880
like this scenario that you can recreate
00:09:53.880 --> 00:09:55.880
in a one-minute script or something,
00:09:55.880 --> 00:09:57.880
it's slow and it should take only 30 seconds,
00:09:57.880 --> 00:10:01.560
you can run cProfile on it for one minute on your laptop and be like, okay,
00:10:01.560 --> 00:10:04.680
I'm going to fix this piece of code. Then if you want to see what's
00:10:04.680 --> 00:10:07.400
happening in production with a real workload for real,
00:10:07.400 --> 00:10:11.000
and like you were saying, streaming the data to see
00:10:11.000 --> 00:10:14.600
in real time what's going on, well cProfile doesn't fit.
00:10:14.600 --> 00:10:18.520
And also, any deterministic profiler which tries to catch
00:10:18.520 --> 00:10:22.680
everything your program does will not work with good performance. So you have
00:10:22.680 --> 00:10:27.560
to take another approach, which is what most profilers
00:10:27.560 --> 00:10:30.800
for continuous profiling use, which is statistical profiling,
00:10:30.800 --> 00:10:32.560
where you actually sample your program
00:10:32.560 --> 00:10:35.680
and you try to look at what it does
00:10:35.680 --> 00:10:36.520
most of the time.
00:10:36.520 --> 00:10:38.440
So it's not a true representation.
00:10:38.440 --> 00:10:40.520
It's not like the reality 100%.
00:10:40.520 --> 00:10:41.960
It's a good statistical approach
00:10:41.960 --> 00:10:44.040
of what your program is doing most of the time.
00:10:44.040 --> 00:10:47.780
- I see, is that more of the sampling style of profilers
00:10:47.780 --> 00:10:51.860
where it's like every 200 milliseconds,
00:10:51.860 --> 00:10:53.000
like what are you doing now?
00:10:53.000 --> 00:10:54.120
What are you doing now?
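The "what are you doing now?" sampling idea can be sketched in pure Python with `sys._current_frames` (a toy illustration of statistical profiling, not how Datadog's continuous profiler is implemented; `sample_main_thread` and `hot_loop` are made-up names):

```python
import collections
import sys
import threading
import time

def sample_main_thread(duration=0.5, interval=0.01):
    """Start a background thread that periodically grabs the main thread's
    current frame and counts which function it is in -- the core idea of
    statistical (sampling) profiling."""
    main_id = threading.main_thread().ident
    counts = collections.Counter()
    deadline = time.monotonic() + duration

    def sampler():
        while time.monotonic() < deadline:
            frame = sys._current_frames().get(main_id)
            if frame is not None:
                counts[frame.f_code.co_name] += 1
            time.sleep(interval)

    thread = threading.Thread(target=sampler)
    thread.start()
    return thread, counts

def hot_loop():
    # A CPU-bound function the sampler should catch red-handed.
    total = 0
    for i in range(20_000_000):
        total += i
    return total

sampler_thread, counts = sample_main_thread()
hot_loop()
sampler_thread.join()
print(counts.most_common(3))
```

The counts are only a statistical picture — functions that run for a tiny fraction of the time may never be sampled — but the overhead is a fixed, small cost per interval, which is what makes this approach viable in production.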