June 23, 2018
02:19 rpifan joined
04:34 xtreak joined
04:48 SkyRocknRoll joined
05:06 xtreak joined
05:18 xtreak joined
06:55 tobixx joined
07:23 <tobixx> Hi, need some advice, can't get around it. The problem: over size limit. The task: find duplicate documents for a time period. The important data: precalculated md5 of the document, a timestamp, and a compound index on both. This would at least give me duplicated requests per second, but due to the result size I can't do anything useful with it, like filtering based on count of duplicates, or sort and limit. Any thoughts?
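[Editor's note: the chat does not name the database, but the md5 field plus the "over size limit" symptom are consistent with MongoDB's aggregation limits. A minimal sketch of the kind of pipeline being asked about, assuming a hypothetical collection `docs` with fields `md5` and `ts`: group by hash, keep only real duplicates, and sort/limit inside the pipeline so the oversized result is never materialized. The `simulate` helper is a pure-Python model of what the pipeline computes, for illustration only.]

```python
# Sketch of a duplicate-finding aggregation for the problem described above.
# Assumptions (not stated in the chat): MongoDB, a collection "docs" with a
# precomputed "md5" field and a "ts" timestamp, and a compound index on both.
from collections import Counter
from datetime import datetime

def duplicate_pipeline(start, end, top_n=100):
    # Restrict to the time period first (can use the compound index), then
    # group by md5, keep only counts > 1, and sort/limit *inside* the
    # pipeline instead of in the client, so the result stays small.
    return [
        {"$match": {"ts": {"$gte": start, "$lt": end}}},
        {"$group": {"_id": "$md5", "count": {"$sum": 1}}},
        {"$match": {"count": {"$gt": 1}}},
        {"$sort": {"count": -1}},
        {"$limit": top_n},
    ]

# With pymongo this would run as (allowDiskUse lets $group spill to disk
# when it exceeds the in-memory stage limit):
#   db.docs.aggregate(duplicate_pipeline(start, end), allowDiskUse=True)

def simulate(docs, start, end, top_n=100):
    # Pure-Python model of the pipeline above, on a list of dicts.
    counts = Counter(d["md5"] for d in docs if start <= d["ts"] < end)
    dupes = [{"_id": m, "count": c} for m, c in counts.items() if c > 1]
    dupes.sort(key=lambda d: d["count"], reverse=True)
    return dupes[:top_n]

sample = [
    {"md5": "aaa", "ts": datetime(2018, 6, 23, 7)},
    {"md5": "aaa", "ts": datetime(2018, 6, 23, 8)},
    {"md5": "bbb", "ts": datetime(2018, 6, 23, 9)},
]
print(simulate(sample, datetime(2018, 6, 23), datetime(2018, 6, 24)))
# → [{'_id': 'aaa', 'count': 2}]
```

[The key design point is that filtering, sorting, and limiting happen server-side in the pipeline, so only the top duplicates cross the wire rather than the full grouped result.]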
07:58 rendar joined
08:55 xtreak joined
10:34 xtreak joined
11:20 Anticom joined
11:47 Brett[Air] joined
11:57 Brett[Air] joined
12:04 rpifan joined
14:40 FetaMight joined
16:19 SkyRocknRoll joined
16:22 bobh joined
18:21 rpifan joined