00:06 vaibhavsagar joined
00:14 spicydonuts joined
00:14 zph joined
00:15 dni- joined
00:18 takle joined
00:25 takle joined
00:34 takle joined
00:36 baweaver joined
00:43 takle joined
00:43 ridho joined
00:51 takle joined
00:52 sigmundv joined
01:01 vaibhavsagar joined
01:01 takle joined
01:01 conal joined
01:02 ridho joined
01:04 conal joined
01:10 takle joined
01:10 GreySunshine joined
01:11 twopoint718 joined
01:14 conal joined
01:24 takle joined
01:28 sigmundv joined
01:31 newhoggy_ joined
01:36 takle joined
01:40 NoCreativity joined
01:40 Youmu joined
01:41 aarvar joined
01:41 takle joined
01:47 Big_G joined
01:48 pixelfog_ joined
01:48 newhoggy joined
01:49 takle joined
02:01 newhoggy joined
02:03 takle joined
02:04 dni- joined
02:12 GreySunshine joined
02:31 takle joined
02:35 GreySunshine joined
02:39 takle joined
02:40 NeverDie joined
02:46 exferenceBot joined
02:47 takle joined
02:47 <lpaste> saylu pasted “Horrible record transformation” at http://lpaste.net/355018
02:48 <saylu> Hey guys -- I have this truly awful bit of code and I would love some advice on refactoring.
02:49 <saylu> I have a map of source URLs to the second record type with a bunch of data. I'm going through the map and, depending on what 'category' the URL is associated with, aggregating all of the data into a total for that category.
02:49 <geekosaur> one thing you can do is use the field names for the local bindings, enable RecordWildcards, and replace the field list with CategoryTotal {..}
02:49 <saylu> The code works just fine -- we've used it often this week -- but it's awfully clumsy. Any obvious improvements I could make?
02:50 <saylu> geekosaur: I'll take a look at the extension
02:50 hexagoxel joined
02:52 <saylu> geekosaur, I'm not sure what you mean
02:52 <saylu> What do you mean by "replace the field list"?
02:53 <geekosaur> instead of having count = ... in the where, have clCount =
02:53 <geekosaur> so that the local binding's name is the same as the field name
02:53 <geekosaur> because then the record wildcard {..} will pair up the field names with the local bindings that have the same name
02:55 <saylu> Like this?
02:55 <saylu> https://www.irccloud.com/pastebin/WHlfmkd3/
02:57 <lpaste> geekosaur annotated “Horrible record transformation” with “Horrible record transformation (annotation)” at http://lpaste.net/355018#a355019
03:00 <geekosaur> there may be ways to cut down on the actual transformations, I didn't look very closely for repeated patterns
03:01 <geekosaur> actually it looks like something is possible there too
03:01 <geekosaur> but my local network is sucking enough that loading web pages is taking forever...
03:01 <saylu> With RecordWildcards, does the order matter?
03:03 <vaibhavsagar> saylu: no
03:03 <glguy> saylu: You know you can construct a record value like this right? : CategoryTotal { ctThing = 1, ctThat = 2 }
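A minimal sketch of the two suggestions above, using a simplified, hypothetical CategoryTotal with clSessions/clCount fields standing in for the real record in saylu's paste:

    {-# LANGUAGE RecordWildcards #-}

    -- Hypothetical, simplified record; the real field list is in the lpaste above.
    data CategoryTotal = CategoryTotal
      { clSessions :: Int
      , clCount    :: Int
      }

    -- Explicit record syntax, as glguy suggests:
    emptyTotal :: CategoryTotal
    emptyTotal = CategoryTotal { clSessions = 0, clCount = 0 }

    -- RecordWildcards, as geekosaur suggests: local bindings named after the
    -- fields fill in CategoryTotal {..} automatically.
    mkTotal :: [Int] -> CategoryTotal
    mkTotal xs = CategoryTotal {..}
      where
        clSessions = sum xs
        clCount    = length xs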
03:03 takle joined
03:04 <__red__> umm, silly question re: Lambda Calculus in the haskellbook.com book. is \x.(\y.xy) equivalent to (\x(\y.xy)) ?
03:05 <__red__> it's a jump in the text I couldn't follow
03:05 <glguy> __red__: Your second bit missing the '.' is a syntax error
03:05 <saylu> glguy: I suppose that would be an easy improvement :)
03:05 <__red__> glguy: that's what I thought... but that's what the book has
03:05 <__red__> in a few places
03:06 <lpaste> glguy annotated “Horrible record transformation” with “Horrible record transformation (annotation)” at http://lpaste.net/355018#a355020
03:06 <glguy> __red__: Well, they're probably just a few typos to overlook
03:07 <glguy> But [[ \x.(\y.xy) ]] and [[ (\x.(\y.xy)) ]] are the same
03:07 <glguy> You can't arbitrarily add and remove ()s, but you can on the outside
03:07 <glguy> They're just for grouping. Maybe that was your question?
03:08 <__red__> I think the syntax is causing me great consternation - since my brain is still hard-wired for parens meaning something, . meaning something else... and two letters next to each other meaning something else
03:09 <geekosaur> saylu, I see a different issue with your paste, you seem to be missing a const somewhere or are otherwise using \x -> where it should be \x y ->
03:09 <glguy> Well, don't worry about the '.' part too much, Haskell uses a '->' there
03:09 <__red__> and then when I see as an example: \x.x+1
03:09 <geekosaur> e.g. totalSessions
03:09 <__red__> my brain doesn't know whether to parse that as f(x) = x + 1
03:09 <lpaste> saylu annotated “Horrible record transformation” with “Horrible record transformation (annotation) (annotation)” at http://lpaste.net/355018#a355021
03:09 <__red__> or... f(x) = x 1
03:09 <__red__> or f(1) = 1
03:10 <saylu> I see, I can just parenthesize it as well
03:10 <saylu> though it's less intelligible -- I prefer your record "=" syntax glguy
03:10 <__red__> glguy: When the penny drops it will drop
03:10 <thang1> Also alignment goes a long way...
03:10 <glguy> __red__: In Haskell, the lambda expression extends as far to the right as it can
03:11 <__red__> I just wish I knew the "proof" for \xy.xy being equivalent to \x.(\y.xy)
03:11 <__red__> I thought it was left-associative?
03:11 <glguy> __red__: There's no proof, that's just the way the syntax is defined
03:11 <saylu> geekosaur: here? `Just $ M.foldr (\x -> (+ clSessions x)) 0 filtered`
03:11 <glguy> __red__: One is just a shorthand for the other
03:11 <__red__> so how do you know it works for all functions?
03:12 <glguy> __red__: How do I know that what works for all functions?
03:12 <geekosaur> yes. M.foldr will have two parameters for the lambda, `const` eats one of them which is why const (+1) for the count
03:12 texasmynsted joined
03:12 <glguy> __red__: *function application* is left-associative
03:13 <geekosaur> and my network is still screwed up so I'm not loading stuff any more :/
03:13 <saylu> Won't this --> `(+ clSessions x)` be awaiting the second parameter and immediately apply it when it arrives?
03:13 <saylu> The code does work just fine as written (though it's clunky)
03:13 <saylu> have a partially applied (+) in that case
03:14 <saylu> No worries geekosaur!
03:14 <saylu> Good luck with the connection )
03:14 <saylu> *:)
03:14 <glguy> saylu: (\x y -> y + stuff x) is nicer to read than (\x -> (+ stuff x))
03:15 <saylu> Makes sense. An easy fix, thanks
03:15 <glguy> I think even (+) . stuff is nicer than (\x -> (+ stuff x)) -- though I'd recommend the one above
03:16 <__red__> glguy: So if you can only do substitution in lambda calculus... there has to be a way to substitute your way from \xy.xy to \x.(\y.xy) - right?
03:16 <glguy> no
03:16 <glguy> one is just a shorthand for the other
03:17 <glguy> there's no substitution
03:17 <glguy> It's just saying that writing: \x. \y. x y gets tiresome
03:17 takle joined
03:17 <glguy> so let's agree that's the same as \x y. x y
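The same shorthand carries over directly to Haskell lambdas; a tiny illustration with made-up names:

    -- All three are the same function: multi-argument lambdas are sugar
    -- for nested single-argument ones.
    apply1, apply2, apply3 :: (a -> b) -> a -> b
    apply1 = \x y -> x y
    apply2 = \x -> \y -> x y
    apply3 x y = x y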
03:19 <__red__> glguy: ahhhh
03:20 <__red__> so the root of my misunderstanding here (I hope) is that since we really only want to deal with single argument functions (but we still want to be able to write them)
03:20 <__red__> we "lie" in how we write them down but then when we do the actual math we write them down properly?
03:21 <thang1> If you're doing formal math for a math paper you'll generally do it "properly"
03:21 <__red__> glguy: I notice you used spaces in there... \x y. x y
03:21 <thang1> However, I see this as very similar to physicist vs mathematician notation
03:21 <saylu> glguy: Any improvement to a function like this: `M.foldr (const (+1)) 0`?
03:21 <glguy> __red__: I used spaces because 1) you'll need spaces when you start writing actual Haskell and 2) there's enough opportunity for confusion already without them
03:21 <thang1> For example, mathematicians always write dy/dx but physicists will often just write dy because the dx is already implied, or they'll write dyx or dy_x
03:22 <saylu> I don't want to sum the map; I want to count incidences.
03:22 <__red__> I wonder why the author didn't format it with spaces then
03:22 <__red__> lemme check the other articles he cites - see how common each format is
03:22 <glguy> Because the book was already long enough? ^_^ I don't know
03:23 <__red__> although if haskell uses spaces, I should get that in my head anyways
03:23 <glguy> __red__: It's a bit philosophical about what it all means.
03:23 <glguy> \x. x is syntax that defines a function, it's not really "the function"
03:24 <__red__> I may be overthinking it. It's a personality failure ;-)
03:24 <thang1> Which book is this? :) Sounds like haskell from first principles
03:24 <__red__> it is
03:24 <thang1> The first chapter is just designed to get you sort of thinking in lambda calculus terms
03:25 <thang1> I wouldn't worry super hard about understanding it all. Just read through it, try some stuff out, and move on. Go back to it again after chapter 3-4 or so and you'll have more "aha" moments.
03:25 <__red__> thang1: Yeah - I think I will.
03:26 <thang1> The book is somewhat designed to do that. Further, they write everything in chapter 1 in pure and traditional math notation. So no spaces, etc.
03:27 <__red__> although interestingly... I wrote a function in ghci whose output had no influence from the input
03:28 <__red__> I wondered if internally it actually did anything with the value I passed it or whether the compiler just replaced the function with the equivalent of "always return this Num"
03:29 <thang1> @let function a = 4
03:29 <lambdabot> Defined.
03:29 <thang1> > function 15
03:29 <lambdabot> 4
03:29 <thang1> > function 1245
03:29 <lambdabot> 4
03:29 <thang1> > function "spaghetti nonsense"
03:30 <lambdabot> 4
03:30 <__red__> exactly
03:30 <thang1> Yep, it throws it away.
03:30 <thang1> Equivalent: function _ = 4
03:30 <thang1> where _ is a type-hole that tells the compiler "I don't care what goes here, just throw it away"
03:30 <__red__> I wonder if you did something like (forgive me - pseudocode because I'm still at Chapter 1 here)
03:30 <thang1> :t function
03:30 <lambdabot> Num t => t1 -> t
03:31 <__red__> function a = (whole bunch of really long, expensive calculations on a) ; return 4
03:31 <glguy> thang1: The _ pattern is better known as a "wildcard" pattern
03:31 <__red__> whether the compiler would actually generate the code
03:31 <__red__> or optimize out everything except the "return 4"
03:32 <thang1> sorry, you're right. My bad. You use _ as a typehole in other places and I get it mixed up every now and then :p
03:32 <glguy> The "typed" hole is when _ is used as an expression
03:32 <__red__> listen to me, I can't even write a damn function and I'm already asking about compiler optimization implementation
03:32 <* __red__> is going to go read a book.
03:32 <thang1> __red__: You can do something like that, sure
03:33 <thang1> The important thing is that in haskell you wouldn't really ever actually do that
03:33 <thang1> abstraction is basically free so there's no reason to ever have a function do more than exactly one thing
03:33 <__red__> I'm attracted to haskell because I want a safe systems language
03:33 malaclyps joined
03:33 <__red__> that's the domain I'm aiming for.
03:34 <thang1> That is, if I were to write a sum function that sums up a list of Ints, I wouldn't write: sum [list] = a) go through list, b) sum up items, c) do other thing, d) do other thing, e) return result
03:34 <__red__> I mean, I could do rust... but since I do erlang and elm for server + browser... haskell seemed the natural choice
03:34 <thang1> I was gonna say, the safe systems language is honestly Rust, not Haskell
03:35 <thang1> Now if all you want is a safe compiled language that's comparable to python in terms of speed? Haskell's your man
03:35 <__red__> thang1: I want something I can compile across multiple platforms to executables... make calls to native APIs etc
03:35 <__red__> speed isn't as important - honestly
03:36 <__red__> being BugFree[tm] is.
03:36 <glguy> saylu: It looks like you just want the Prelude.length function or Data.Map.size
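A quick sketch of what glguy is pointing at for saylu's counting fold, assuming only a Data.Map import (the surrounding code is in the lpaste above):

    import qualified Data.Map as M

    -- Counting the entries of a Map: the fold saylu wrote, and the direct call.
    countFold :: M.Map k v -> Int
    countFold = M.foldr (const (+ 1)) 0

    countDirect :: M.Map k v -> Int
    countDirect = M.size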
03:36 <__red__> is haskell really that slow?
03:37 <glguy> No, it's not comparable to Python
03:37 <glguy> At least not similar, lots of stuff can be "compared"
03:46 <thang1> Haskell is often faster than Python for a lot of things, slower for some things, and getting a bit faster overall
03:47 <thang1> idk, I feel like comparing it to python gives the right expectation for things; you'll be pleasantly surprised more often than not. Definitely better than people pretending it was as fast as C "with only a little bit of work"
03:48 <thang1> (insert long blog post about manually unrolling, unboxing types, looking at assembly code, configuring manual rewrite rules, etc)
03:48 <thang1> But glguy is definitely right. You can't really compare the two languages because they're so different in approach.
03:53 <__red__> It's funny... I don't write code in erlang for performance... I write in it for scalability.
03:53 dni- joined
03:53 <__red__> I had a vendor write an app that processed an event-stream - about 100k eps
03:53 <__red__> it flattened a 16 core server with 64G RAM.
03:54 <__red__> Mine, in erlang did it in 5% CPU and 40M of RAM
03:54 <thang1> Exactly. Write your program in the language which has the features you want
03:54 <__red__> exactly
03:54 <thang1> Damn, that's impressive
03:54 <__red__> Problem Domain => Problem Language
03:54 <thang1> Did your vendor guy write his in Java and JS or something?
03:54 <__red__> Python then Go
03:54 <thang1> lol
03:55 <__red__> honestly, I doubt his Go implementation was written as an experienced Go programmer would... to be completely fair to the language
03:55 eacameron joined
03:55 <__red__> thang1: It connected via AMQ, took a stream of events encoded with google protobufs, did some regexes and then emitted the data to different servers via UDP depending on patterns
03:56 <__red__> thang1: my original bottleneck was the protobufs decoding... a single CPU wasn't fast enough to do the decoding so I ended up spawning a new VM process per message
03:56 <__red__> so yeah - I'm spawning 100,000 processes per second
03:57 <__red__> and it's still at less than 5% of CPU and 40M (yes, Meg) of memory
03:57 <__red__> the VM's spawn instruction is stupidly lightweight
03:57 <__red__> anyways - sorry, I digress
03:58 <__red__> but - back to haskell - I read that someone actually managed to write a kernel module in haskell
03:58 <thang1> eww
03:58 <__red__> so I figure if that's possible... (even with some contortion)... then writing something that can call basic C libraries should be reasonable
03:58 <thang1> (about the go thing)
03:58 <thang1> Oh you can definitely do some FFI with Haskell
03:59 <__red__> I really don't understand the Go concurrency model
03:59 <__red__> it seems like they grasped fail from the jaws of awesome.
03:59 <thang1> It's not a very good one, in my opinion, but I'm somewhat biased against the language
03:59 <thang1> https://wiki.haskell.org/Foreign_Function_Interface
03:59 <thang1> ^ this is what you use to fuck around with C/C++ code in Haskell, for the most part
03:59 <__red__> Thanks
04:00 takle joined
04:00 <thang1> The FFI is how to inject one language into another and it's a generic property of most modern languages. Python has CPython and Cython which wrap the FFI up nicely. Rust has a C FFI, C++ has a C FFI, Haskell and most other modern languages have a C FFI
04:01 <__red__> thang1: I've done the equivalent in Perl (both .xs and Inline.pm), and using Ports in erlang (I refuse to use NIFs)
04:01 <__red__> so the concepts I'm sure will be pretty much identical
04:02 <__red__> I'm following my own career advice. Minimum of one language per problem domain.
04:03 <thang1> Yup, it will be. One thing you'll want to be aware of is learning types really well. C has almost zero type safety so you'll want your FFI injection to be guarded as closely as possible in the haskell level and made as pure as possible
04:05 <thang1> mallocForeignPtr is also really useful. It basically turns any C pointer into a smart pointer that cleans up after itself.
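A minimal FFI sketch along the lines of the wiki page thang1 linked; sin from math.h is a real C function, while the Haskell-side name c_sin is just for illustration:

    {-# LANGUAGE ForeignFunctionInterface #-}

    -- Bind C's sin() from libm and call it like any Haskell function.
    foreign import ccall unsafe "math.h sin"
      c_sin :: Double -> Double

    main :: IO ()
    main = print (c_sin 0.5)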
04:06 <__red__> thang1: my first project will probably be some code to push to my enterprise and steal^Wdownload useful security data for audit.
04:06 <* __red__> nods.
04:07 <thang1> nice nice
04:07 takle joined
04:07 <thang1> Anyway, haskell from first principles is an amazing book so far. Really, the best beginner programming intro I've seen for any language so far
04:08 <__red__> probably mass statistical analysis of registries looking for "one of these things is not like the other"
04:08 <thang1> If you want to do mass statistical analysis, you're probably better off using a language with a really strong data science library, like Python
04:08 <__red__> 80% common - "normal"
04:08 <__red__> 18% ish - "damn users"
04:09 <__red__> 2% Crap - "Now I'm not going home"
04:09 <__red__> thang1: computation will be on the server that collects the data
04:09 <__red__> haskell will be pushed and executed on end-points
04:09 <thang1> It's not so much the computation as the fact that haskell doesn't have tons of prebuilt functions and libraries for making stats painless
04:10 <__red__> Honestly, I'm likely to use erlang for that - not necessarily because it has the best data analysis libraries but because my fluency there is significant
04:10 <thang1> nvm, I lied
04:10 <thang1> http://hackage.haskell.org/package/statistics
04:10 <__red__> hah
04:10 <__red__> so - I work at a software company
04:10 <__red__> and our data sciences people have a copy of "Learn you a Haskell" on the shelf
04:11 <__red__> I've not met them yet - hopefully in a few weeks (different office)
04:11 <thang1> Kendall correlation, 15 different distribution functions, root finding, matrix shit, what have ya. All the standard tests
04:11 <__red__> but will be nice to meet some functional people in a world of Java misery
04:11 <thang1> __red__: Honestly, I really dislike LYAH.
04:12 <thang1> Not gonna lie, I think the book is stupid and way too outdated. Even code from 2010-2013 is outdated at times. Also, a lot of people I've talked to have complained that they went through the book and felt like they really "got" haskell and then utterly crashed and burned when they tried to use it for real stuff
04:12 <thang1> Not so for the first principles book :p
04:12 <__red__> I've heard others mention that.
04:13 <__red__> honestly, I'm the kind of person that tends to get stuck in a book (regardless of book because I overthink)
04:13 takle joined
04:13 <__red__> so I get several of them and rotate around them
04:13 <thang1> That's actually why I like HFFP so much. It's slow enough, yet speedy enough that even when I get stuck I can just skip ahead to somewhere where I stop getting stuck
04:13 <__red__> get stuck in book A, move to B. Stuck in B, move to C. Usually - enough exposure from B+C that when I move back to A I'm unstuck
04:13 <thang1> and then you can go back
04:14 <__red__> I'm not far enough in the book to understand the link between Lambda Calculus and Haskell yet and I'll take y'all at your word that it's fundamental
04:14 <thang1> The reason HFFP starts from lambda calculus
04:15 <__red__> but my brain keeps saying: "You used Lambda Calculus to Simplify Haskell... now you're teaching me two impossible things"
04:15 <__red__> apologies to jwz for butchering his regex quote
04:15 <thang1> is because Haskell /is/ lambda calculus. That is, there is literally nothing we can do in haskell that cannot be rewritten identically (functionality wise) in lambda calculus
04:15 <__red__> "You solve a problem with a regex, now you have two problems"
04:15 <thang1> haha fair :p
04:16 <thang1> But for instance, all functions that take multiple arguments are actually syntactic sugar for chained functions
04:16 <__red__> I'm also working on the principle that the "Piper must be paid"
04:16 <__red__> kinda like with strict typing. Getting something to compile can be a misery, but it's better misery there than bugs for your users.
04:16 <thang1> a -> b -> c is really a function that takes type a and returns a function that takes type b and returns c
04:17 <__red__> going from imperative to erlang/elixir required a complete brain re-write in how I solved problems.
04:17 <__red__> I'm expecting haskell to require the same level of pain
04:17 <thang1> Ideally in Haskell, when you get to understand the compiler a bit more and the language a bit more, you'll find that the strict typing actually helps you solve things
04:17 <thang1> For instance, the book will have you write out several functions, given only the type signature
04:18 <__red__> thang1: like elm (which is based on haskell I believe)
04:18 <__red__> and NixOS expressions are very haskelly too
04:18 <thang1> A function of type: a -> a takes a and returns a. What can that do? Literally nothing except return a. Congrats, it's the identity function. That sort of thing :)
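Two tiny examples of the points above, as a sketch (ignoring undefined/bottom for the a -> a claim):

    -- The only total inhabitant of a -> a is the identity function.
    myId :: a -> a
    myId x = x

    -- a -> b -> c really is "a function returning a function":
    add3 :: Int -> Int -> Int -> Int
    add3 = \x -> \y -> \z -> x + y + z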
04:18 <__red__> I'm told
04:18 <thang1> NixOS expressions should be quite haskelly considering it's based off of it :p
04:19 <thang1> I wouldn't say Elm is based on Haskell so much that it's inspired by it.
04:19 <__red__> I'm really torn between NixOS and Guix
04:19 <thang1> (NixOS all the way)
04:19 <__red__> on the one hand, NixOS is based on Haskell, Guix on LISP (score NixOS)
04:20 <thang1> After all, just because a language is imperative doesn't mean it's a C-clone :p
04:20 <__red__> NixOS has systemd, Guix has sheperd (score Guix)
04:20 <thang1> On one hand, NixOS is actually usable and Guix has been "semi working" for 20 years /s
04:20 <__red__> but systemd man
04:20 <__red__> :-(
04:20 <thang1> Systemd is the best thing to ever happen to init systems. fitemeirl
04:21 <__red__> ...
04:21 <__red__> œ∑´®†\¨ˆøπ“
04:21 <thang1> Sure, I know a lot of people bitched about it, but bah gawd did it ever make system administration WAAAY easier. Everything else was just piles of hacky shell scripts
04:21 <__red__> Oh wow, you're a world of wrong but I'll still love you :-P
04:21 <thang1> haha, don't worry, I hate a lot of things about systemd too
04:22 <thang1> Like, I know what I'm saying is a white lie and only a mostly-truth :p
04:22 <__red__> My biggest issue with systemd is that it's eating everything - badly
04:22 <__red__> if it was just init, I honestly wouldn't care that much
04:22 <__red__> but when they add "let's do a root shell for systemctl because su doesn't handle environmental variables properly..."
04:23 <thang1> My biggest gripe is not that it's eating everything, but that it's very aggressively making sure that it's systemd or the highway. Forget about making your API stable, just subtly fuck with everyone until nobody can build an injection into systemd that works consistently
04:23 <__red__> (someone teach Poettering the difference between su and su -)
04:23 <__red__> ...
04:23 <thang1> that'll do it /s
04:23 <__red__> or the sheer arrogance
04:23 <thang1> Poettering is incredibly good at building very stable tools and bringing a unique vision to fruition
04:23 <__red__> yeah... rm -rf /foo/.*
04:23 <thang1> I'll give him that
04:24 <__red__> You're just trolling me now
04:24 <__red__> seriously.
04:24 <thang1> I just wish it didn't come with the god complex of "if I built it correctly, everyone else is wrong"
04:24 <thang1> actually I take back the stable tool thing
04:24 <thang1> he's good at bringing a vision to fruition :p
04:24 <__red__> sure :-)
04:25 <thang1> (I meant stable as in "only one tool" sorta way but that wasn't the right word for it)
04:25 <__red__> the fundamental problem is that he's conflated desktop / laptop with server requirements making server administration substantially more complex.
04:25 <__red__> predictable interface names :-P
04:26 <thang1> Sure, the upside is that if you can configure it once it'll work on everything
04:26 <__red__> a log management tool which rm -rf 's your entire server
04:26 <thang1> The downside is now you gotta figure out how to configure it :p
04:26 <__red__> Well, it will work until you plug a USB device in and it renames all your damn ethernet interfaces
04:26 <thang1> usb still does that?
04:26 <thang1> fukkin c'mon
04:27 <__red__> or my favorite...
04:27 <__red__> change the default behaviour for UNIX processes for the last 40 damn years
04:27 <thang1> What was the change, again?
04:27 <__red__> when a user logs out, kill all their processes... and set the default to kill them all
04:27 <__red__> so, I log onto my server... start my app - log off
04:27 <__red__> systemd kills my app
04:28 <thang1> oh... yeah
04:28 <__red__> log onto my server, start a screen / tmux session with irc... log off, kills my sessions
04:28 <thang1> horribly convenient if you're on a laptop or something
04:28 <__red__> "screen and tmux develpers need to add systemd support to their app"
04:28 <__red__> seriously?
04:28 <thang1> Stupidly moronic if you're on a server
04:28 <__red__> thang1: again - that's the fundamental issue.
04:28 <__red__> the man DOESN'T OWN a server
04:29 <__red__> he said it in an interview last year
04:29 <thang1> And expecting the entire world to bow to his preferred way to do things... seems stupid
04:30 <__red__> "I don't think people who were using computer 20 years ago really understand how computers work now because of how much computers have changed"
04:30 <__red__> seriously.
04:30 <thang1> lol
04:30 <* __red__> brushes his beard and screams "Gerrroff mah lawwwn"
04:30 <thang1> He does know that like a significant percentage of the world's computing power is still on fucking MAINFRAMES, right?
04:30 <__red__> doubt it
04:31 <__red__> funny story - I trolled my internal vulnerability team a few years ago. I dropped a mainframe on wifi at work
04:31 <thang1> wait what
04:31 <__red__> I was like: "I wonder if they'll just assume it's a false positive and ignore it or actually investigate"
04:31 <__red__> they ignored it
04:31 <thang1> like you got on the wifi and then disconnected the mainframe from the network? Or you just added the mainframe to the network?
04:32 <__red__> Mainframe in an emulator on the network
04:32 <__red__> you can run mainframes on your laptop
04:32 <thang1> oh right, true
04:32 <__red__> (obviously - you don't get the IO performance which is really what you buy big-iron for)
04:32 <__red__> well, used to
04:33 <__red__> I guess now you buy mainframes because nobody remembers how your application works ;-)
04:33 hphuoc25 joined
04:33 <thang1> SSDs are finally getting rid of the last reason to get mainframes now :p
04:34 <thang1> Nah, now you buy mainframes because your app is already coded in COBOL and you can't be arsed to rewrite it--bugs and all--into a modern language because everything is coded *just so* and even the slightest change in moon-phases ruins everything
04:34 takle joined
04:34 <__red__> as a penetration tester, I <3 mainframes
04:34 <__red__> :-)
04:35 <thang1> Why? (I'd guess because they're likely super insecure?)
04:35 <__red__> people at work quickly understand that when I start a sentence with "as a penetration tester, I love..." it's bad news bears
04:35 cschneid_ joined
04:35 <thang1> lol, I figured
04:35 <__red__> they're managed by people who are typically... ummm
04:35 <__red__> they assume that they're not a target
04:35 <__red__> and set stupid passwords like
04:36 <__red__> "IBMSUPPORT1"
04:36 <thang1> right, because nobody's gonna target the pharmaceutical running their finances off of a card-punching machine
04:36 nicknovi1 joined
04:36 <thang1> (for specific definitions of "nobody"...)
04:36 <geekosaur> I grant that people who _run_ mainframes are likely to be well behind the times on secure practices. but mainframes have moved on since the 1960s
04:37 <geekosaur> they can still emulate card readers/punches at need but that's not the main way to interact with them
04:37 <thang1> oh the card-punching comment was total sarcasm :p
04:38 <__red__> I've seen banks exposing mainframes to the Internet via FTP
04:38 <thang1> Not even SFTP?
04:39 <__red__> nope
04:39 <geekosaur> not even, sadly
04:39 <__red__> in their defense, the files are encrypted
04:39 <__red__> but arbitrary upload and execution could be a thing
04:39 <* geekosaur> .. has experiences there. the people who run banks are _stupid_
04:39 <thang1> Nice. And of course there's the whole universe of COBOL and FORTRAN code out there
04:40 <thang1> __red__: Are they at least encrypted well? Or is that a loss, too?
04:40 <geekosaur> scientists and engineers still write and run fortran
04:40 <__red__> in the cases I've seen, encrypted well
04:40 <__red__> I have a lot of respect for COBOL and FORTRAN
04:40 <geekosaur> then again I think I'd rather deal with fortran than matlab >.>
04:40 <thang1> Fortran > matlab
04:41 <thang1> I have a lot of respect for Cobol and Fortran as well. They were great for their time, but that time was 30 years ago if not more :p
04:41 takle joined
04:41 <__red__> still better languages than some of the more modern languages I've seen
04:42 <__red__> I mean... honestly
04:42 <geekosaur> there are very, very few compilers for other languages that can match the optimization of good fortran compilers. and you can't make up the difference fully with map/reduce or etc.
04:42 <thang1> I also have a lot of respect for C for being such a pioneer. However I'm quite sad that C got *incredibly* successful. We could be so much further as a technology and industry if people had focused their effort on functional programming, automated theorem proving, etc
04:42 <__red__> would you trust global banking transactions to node / javascript over COBOL?
04:42 <thang1> geekosaur: oh definitely. Fortran is nuts. And __red__, that's pretty tough...
04:43 <__red__> Me: I'd like to write a javascript application... <runs new app command>
04:43 <geekosaur> __red__, make that haskell and I'd seriously consider it
04:43 <thang1> I'd actually say JS/Node because most COBOL code was constructed using extremely old engineering practices that bound things together so tightly that rewriting was fucking terrible
04:43 <__red__> Me: Why do I have over 10,000 source-code files before I even start?
04:43 <thang1> Now if it's modern programming discipline and code writing practices then COBOL > JS for sure
04:44 <thang1> As long as they keep the OOP out of it -_-
04:44 <__red__> thang1: interesting. My exposure to COBOL/FORTRAN appears to have been more positive than yours.
04:44 <* __red__> hands thang1 the 'gang of four'
04:44 <thang1> __red__: I heard a lot of horror stories. Haven't had a lot of hands on with them :)
04:45 <thang1> I'm also fairly biased against OOP as a programming paradigm because I think it's relatively useless and that all the problems it solves it does so worse than most other solutions out there
04:46 <thang1> I also think "design patterns" are a synonom for "I'm compensating for the weakness in my language and pardigm by inventing new shiny shit to get around its failures"
04:46 <__red__> thang1: in fairness, OOP as it is today is the exact opposite of what the inventor of OOP specified
04:47 <thang1> Exactly. It pisses me off
04:47 <geekosaur> thang1, sometimes they are, sometimes they are not. consider that monads are a design pattern.
04:47 <__red__> OOP was originally message passing and isolation. Not ridiculous amounts of coupling for no reasons.
04:47 <thang1> Erlang is my favorite "OOP"
04:47 <geekosaur> sometimes a design pattern is "how you work _with_ the language", sometimes it is "how you work _around_ the language"
04:47 Levex joined
04:48 <__red__> I think where java went wrong with design patterns is that it created a whole generation of architects that are incapable of designing solutions that aren't made up of those component parts
04:48 <geekosaur> and sometimes it is "how you make sure other programmers understand your code"
04:48 <__red__> and only those parts
04:48 <thang1> True. I don't have many problems with Monads (although they can be abused a bit, like anything else). But that's mostly because Haskell has so few weaknesses in general, "design pattern" wise, that the few ones they use (eg Monads) are acceptable
04:49 <__red__> I don't really have a problem with design patterns at all, only their fetishization at the expense of good design.
04:49 <thang1> Also, none of the Java design patterns are based on mathematical principles
04:49 <thang1> Haskell at least has all of its design patterns rooted in category theory
04:49 <__red__> thang1: that's patently false
04:50 <thang1> Really?
04:50 <__red__> it's based on the mathematical principle of maximising consultant billable hours.
04:50 <thang1> lol
04:50 <thang1> it succeeds admirably there, that's for sure :p
04:50 slomo joined
04:50 slomo joined
04:51 takle joined
04:52 <geekosaur> that's not even working around the language. it's confusing design patterns with design.
04:53 <thang1> Most all of the design patterns in Java explicitly work around a flaw in the language's ability to directly express something
04:53 <thang1> Which is why Lisp has like, zero design patterns whatsoever. You can express everything directly in Lisp with enough macros lol
04:54 <geekosaur> macros _are_ the design pattern there :p
04:56 <thang1> but they're beautiful <3
04:58 takle joined
04:59 <thang1> Also I guess that design patterns would really have to be a bit more rigidly defined to really properly compare haskell vs java
04:59 <thang1> For instance, are foldl/foldr, map, and zippers all considered design patterns?
05:00 <thang1> I wouldn't consider them design patterns at all, but rather just control flow like for loops are.
05:00 <geekosaur> I would consider functional programming style a set of design patterns
05:01 <benzrf> 00:49 <thang1> Haskell at least has all of its design patterns rooted in category theory
05:01 <benzrf> that's not true
05:01 <benzrf> *some* of them are more or less from category theory
05:02 <thang1> geekosaur: so are for-loop, while-loop, and go-tos considered design patterns too?
05:02 <geekosaur> I would say so, yes
05:02 <geekosaur> nobody ever thinks of doing so because they only think of design patterns in the context of OOP
05:03 newhoggy_ joined
05:03 <thang1> I gotta confess, I'm struggling to think of "builder pattern" and "for-loop" as the same category of programming tools (ie, design patterns)
05:04 <geekosaur> well, I would not actually say the for loop itself is a design pattern. but it is at the root of several imperative-style design patterns
05:05 <geekosaur> for example, there's the one for iterators: for (i = base; i; i = i -> next) { ... }
05:05 <geekosaur> hm, why did I use spaces around that
05:05 <geekosaur> that "iterator pattern" is actually supported by macros in *BSD kernel source
05:06 takle joined
05:07 <benzrf> oo
05:07 <thang1> i = base; i; i = i -> next?
05:07 <benzrf> u never write C?
05:07 <geekosaur> that's C
05:07 <monochrom> The single "i" means "i != 0"
05:08 <thang1> that's the part I was missing :p
05:08 <geekosaur> for (initialization; stop test; next step)
05:08 <monochrom> The great unification of pointer, number, and boolean.
05:08 <geekosaur> and yes, it's a common way to test that a pointer is not NULL
05:08 <thang1> Using i instead of i != 0 isn't something that I've seen before
05:08 <geekosaur> very common in C
05:09 <thang1> C programming conventions drive me up a wall sometimes, I swear
05:11 <geekosaur> that kind of thing is why we tend to refer to C as barely having types
05:11 <geekosaur> C grew out of BCPL, which really only had one type (machine word/register word)
05:11 <geekosaur> it still shows
05:13 takle joined
05:18 <thang1> Oh for sure
05:23 newhoggy joined
05:27 kritzcreek joined
05:34 takle joined
05:38 nobodyzxc joined
05:41 takle joined
05:42 dni- joined
05:42 mstruebing joined
05:44 Levex joined
05:48 newhoggy joined
05:55 ridho joined
05:57 hphuoc25 joined
05:57 hphuoc25 joined
05:58 takle joined
06:03 newhoggy joined
06:07 takle joined
06:08 systemfault joined
06:13 newhoggy joined
06:18 takle joined
06:18 newhoggy joined
06:25 newhoggy joined
06:26 takle joined
06:29 Gurkenglas joined
06:30 eatman_ joined
06:35 ralu joined
06:43 dni- joined
06:43 binaryplease joined
06:44 eacameron joined
06:55 takle joined
06:58 blissdev joined
07:00 govg joined
07:02 takle joined
07:09 Geekingfrog joined
07:10 MotherFlojo joined
07:10 eacameron joined
07:12 Pupnik joined
07:12 uglyfigurine joined
07:18 takle joined
07:19 thc202 joined
07:24 takle joined
07:26 govg joined
07:31 bvad joined
07:31 mattyw joined
07:32 takle joined
07:33 newhoggy_ joined
07:45 takle joined
07:47 <eatman_> Hi. I'm still working on the HashLife algorithm (http://www.drdobbs.com/jvm/an-algorithm-for-compressing-space-and-t/184406478) and I'd like to know how the canonicalization step (bottom) is usually done in Haskell.
07:49 <eatman> Ha, didn't see that I was still online...
08:00 Denthir joined
08:05 m1dnight_ joined
08:26 newhoggy joined
08:28 newhoggy joined
08:28 HaskellLord69 joined
08:33 harfangk joined
08:40 newhoggy joined
08:44 nacon joined
08:44 nacon joined
08:44 Durz0 joined
08:50 slomo joined
08:51 cschneid_ joined
08:52 takle joined
08:56 newhoggy joined
08:57 hphuoc25 joined
08:59 zachary12 joined
09:11 eacameron joined
09:13 crave joined
09:20 ali_bush joined
09:20 ali_bush joined
09:30 ephemeral joined
09:46 yellowj joined
10:17 irclogger_com joined
10:17 Topic for
10:17 vaibhavsagar joined
10:27 newhoggy joined
10:33 uglyfigurine joined
10:35 NoCreativity joined
10:35 crave_ joined
10:40 MotherFlojo joined
10:47 newhoggy joined
10:52 cschneid_ joined
10:58 newhoggy_ joined
11:20 hphuoc25 joined
11:38 newhoggy joined
11:45 newhoggy_ joined
11:49 contiver joined
12:02 Denthir joined
12:08 Gurkenglas joined
12:10 newhoggy joined
12:12 eacameron joined
12:14 Iceland_jack joined
12:24 eatman joined
12:28 NoCreativity_ joined
12:32 leothrix joined
12:34 mengu joined
12:44 newhoggy joined
12:52 MotherFlojo joined
12:54 pbrant joined
13:02 <Akii> something interesting I just "solved" http://lpaste.net/355029
13:02 mattyw joined
13:02 <Akii> I had this function `foo`, and even though I caught all possible errors I couldn't reduce the function signature to express this
13:03 <Akii> hence ending up with `fooRight`
13:03 <Akii> right now I thought that it must be possible to somehow get rid of the MonadError constraint
13:03 <Akii> as I'm obviously handling all the possible failures
13:03 jathan joined
13:03 <Akii> then I wrote `catchError'` which seems to do the trick
13:06 <Akii> now i'm wondering why this doesn't work when I use the MonadError class
13:06 <Akii> instead of the type `ExceptT e m a`
13:08 <Akii> well I think I know why
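A guess at the shape Akii is describing (the real code is in the lpaste link above): once every error is actually handled, running the ExceptT layer is what discharges the error type from the signature. Names below are hypothetical.

    import Control.Monad.Except (ExceptT, runExceptT)

    -- runExceptT turns "ExceptT e m a" into "m (Either e a)", which is
    -- then handled completely, so no MonadError constraint remains.
    fooRight :: Monad m => ExceptT String m Int -> m Int
    fooRight action = do
      r <- runExceptT action
      pure $ case r of
        Left  _ -> 0      -- every failure handled here
        Right x -> x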
13:20 mizu_no_oto_work joined
13:23 newhoggy joined
13:39 newhoggy joined
13:39 ebw joined
13:40 <ebw> How to send a cat picture to bribe people? Or should I just send the link?
13:40 <Akii> yup, just links
13:41 <Akii> here are some lambdacats for you https://spl.smugmug.com/Humor/Lambdacats/
13:42 <ebw> lol thanks
13:42 <Akii> gotta love KittehT
13:44 <Akii> but if you have any questions: just ask
13:49 <ebw> I actually have difficulties with the error messages of ghci, especially if you just missed a '=' or similar, like in http://lpaste.net/355004 (notice the otherwise branch).
13:50 <ebw> I just get a parse error, but nothing which hints towards the specific error, like 'missing ='. Is that just something I have to live with in haskell, or do I need to turn on a compiler flag or something?
13:52 <Akii> indentation is strange with this one
13:52 <ebw> I wrote it and I am an absolute beginner :)
13:52 <Akii> otherwise = d : go n
13:53 <Akii> you forgot the =
13:53 <Akii> ah that was not the question :D
13:53 <ebw> yes
13:53 <ebw> question was about quality of error messages :)
13:53 <Akii> well they're usually no parse errors
13:53 chlong joined
13:53 ephemeral joined
13:54 <Akii> what I mean is: I rarely have parse errors
13:54 <Akii> and the type errors can be cryptic but most of the time it's ok I think
13:55 <Akii> also I think they're working on better compile errors
14:00 Levex joined
14:01 peterson joined
14:11 <ebw> ok, thanks. I'll try to forget letters less often
14:13 iAmerikan joined
14:13 eacameron joined
14:14 newhoggy joined
14:16 <ebw> You stated that my indentation is weird; would you please show me a version that is more like how it is usually done?
14:17 sigmundv_ joined
14:19 <Akii> what I meant is this nesting of `where` looked strange at first sight
14:20 Gurkenglas joined
14:21 chlong joined
14:21 newhoggy joined
14:22 eacameron joined
14:26 <lpaste> Akii annotated “haskellbook ch 7” with “haskellbook ch 7 (annotation)” at http://lpaste.net/355004#a355034
14:27 newhoggy joined
14:27 <Akii> I find this more readable, but it can surely be improved upon
14:29 <lpaste> Akii revised “haskellbook ch 7 (annotation)”: “haskellbook ch 7 (annotation)” at http://lpaste.net/355034
14:32 newhoggy joined
14:33 <ebw> I agree that it is more readable. (Ofc I have to change the calling site to keep wordNumber semantics.)
14:33 <Akii> well I just extracted the function go, now that I look at it
14:34 <ebw> Is it conventional to place a new line between the guard lines and the where line?
14:35 <Iceland_jack> For me it depends on how long the line is already
14:35 <Akii> there, my final solution: http://lpaste.net/355036
14:36 <ebw> Ok thanks.
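A rough reconstruction of the guard-plus-where shape under discussion; the real wordNumber code is in the lpaste links above, so the names here are only illustrative:

    -- Splitting a non-negative number into its digits, with the '=' that was
    -- missing from the original 'otherwise' branch.
    digits :: Int -> [Int]
    digits n = go n []
      where
        go d acc
          | d < 10    = d : acc
          | otherwise = go (d `div` 10) (d `mod` 10 : acc)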
14:37 carlomagno joined
14:38 newhoggy joined
14:40 carlomagno1 joined
14:42 petru joined
14:46 newhoggy joined
14:49 <p123> hello, I don't know how to use 'map' properly in order to generate a list with infinity elements .. There are more details. For those willing to give a helping hand, there is a link to the question on stackoverflow: http://stackoverflow.com/questions/43674103/generate-list-of-employees-haskell
14:50 carlomagno joined
14:52 newhoggy joined
14:53 MotherFlojo joined
14:54 <Iceland_jack> I haven't read your question p123, but map preserves the length of its input list
14:54 <Iceland_jack> so it won't 'generate' an infinite list without being passed an infinite list
14:55 conal joined
14:56 newhoggy_ joined
14:57 <p123> Iceland_jack, I know that.. it was the only way to describe my question in 3-5 words without entering in too many details :)
14:57 <p123> it's more like a combination between recursion and map
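A sketch of what Iceland_jack and p123 are circling around; the actual employee types live on Stack Overflow, so plain Integers stand in here:

    -- map keeps the length of its input, so an infinite output needs an
    -- infinite (or self-referential) input.
    naturals :: [Integer]
    naturals = iterate (+ 1) 0          -- [0,1,2,3,...]

    -- "recursion plus map": the list is defined via a mapped copy of itself.
    powersOfTwo :: [Integer]
    powersOfTwo = 1 : map (* 2) powersOfTwo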
14:57 uglyfigurine joined
14:59 peterbecich joined
14:59 uglyfigurine joined
15:02 cschneid_ joined
15:09 newhoggy joined
15:18 p123 left
15:19 szymon joined
15:22 newhoggy joined
15:23 averell joined
15:28 newhoggy joined
15:29 kadoban joined
15:36 eacameron joined
15:39 newhoggy_ joined
15:40 takle joined
15:44 mattyw joined
15:47 uglyfigurine joined
15:48 newhoggy joined
15:53 newhoggy joined
15:54 eacameron joined
15:59 ysgard joined
16:00 geekosaur joined
16:03 geekosaur joined
16:06 newhoggy joined
16:07 hvr joined
16:09 mattyw joined
16:10 eacameron joined
16:13 meandi_2 joined
16:15 newhoggy joined
16:15 eacameron joined
16:16 baweaver left
16:17 chlong joined
16:19 mizu_no_oto_work joined
16:19 newhoggy joined
16:20 eacameron joined
16:20 galderz joined
16:23 malaclyps joined
16:26 newhoggy joined
16:27 eacameron joined
16:33 newhoggy joined
16:36 conal joined
16:39 newhoggy joined
16:39 ebw joined
16:40 abhiroop joined
16:43 <abhiroop> I am trying to factor out my recursion using `fix`
16:43 <abhiroop> I have this: http://lpaste.net/355038
16:43 <abhiroop> How do I express the `fixPointFunc` using fix
16:44 <glguy> fixPointFunc = fix $ \rec x c -> .... in concat (map (rec j) parent)
16:44 <glguy> incidentally we have concatMap
16:44 <glguy> parent == [] is null parent
16:46 <glguy> Also, instead of M.lookup and a case you can use: M.findWithDefault []
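A small sketch of the last two pointers (concatMap and findWithDefault), using a made-up Map of parent/child relationships rather than abhiroop's actual data:

    import qualified Data.Map as M

    -- M.findWithDefault collapses the usual M.lookup-then-case pattern:
    childrenOf :: Ord k => M.Map k [k] -> k -> [k]
    childrenOf m k = M.findWithDefault [] k m

    -- concatMap f xs is the same as concat (map f xs):
    descendants :: Ord k => M.Map k [k] -> k -> [k]
    descendants m k =
      let kids = childrenOf m k
      in  kids ++ concatMap (descendants m) kids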
16:47 newhoggy joined
16:48 jship joined
16:53 newhoggy joined
16:57 <abhiroop> glguy: Thanks, incorporated all your pointers
16:57 geekosaur joined
16:57 <abhiroop> But running into a type error:
16:57 <benzrf> this is haskell, we don't have pointers :o
16:57 <abhiroop> http://lpaste.net/355041
16:58 <abhiroop> benzrf: :D
16:58 <abhiroop> Is the sole purpose of `fix` just to avoid explicit recursion?
16:58 newhoggy joined
16:59 <benzrf> i wouldn't say that, but it *is* true that anything you can write with fix you can also write with recursion
16:59 <benzrf> however, anything you write with `map' you can write by recursing over a list, but i wouldn't say that the sole purpose of map is to avoid explicit recursion
17:00 <abhiroop> In case of `map` I can see the pattern being abstracted out
17:00 <abhiroop> I couldn't find many solid examples in `fix`
17:00 <abhiroop> Any examples?
17:02 <benzrf> ah
17:02 Levex joined
17:02 <benzrf> well, ok - it's true that fix isn't used all that often in practice :)
17:02 <benzrf> so maybe your description of fix is closer to what it is than my description of map is to map
17:02 <benzrf> but it's elegant to write things like:
17:02 <benzrf> > fix (1:)
17:02 <lambdabot> [1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1...
17:04 newhoggy joined
17:06 Uniaika joined
17:07 <glguy> abhiroop: You forgot to remove the parameters from the definition of fixPointFunc
17:07 <glguy> abhiroop: Look at what I wrote again
17:07 newhoggy_ joined
17:08 <abhiroop> glguy: Ooops! Thanks :)
17:11 <benzrf> abhiroop: fun fact: fix computes the "least" fixed point of a function, where "least" is with respect to an ordering that puts "more-defined" things above "less-defined" things
17:12 <benzrf> e.g., undefined or an infinite loop is at the bottom, whereas a fully terminating term like "Just 3" would be maximal
17:12 <benzrf> so if an infinite loop expression is a fix point of "f", then "fix f" will also be an infinite loop
17:12 <benzrf> but if not...
17:12 newhoggy joined
17:13 <abhiroop> Wow! that's amazingly powerful: "where "least" is with respect to an ordering that puts "more-defined" things above "less-defined" things"
17:14 <benzrf> well, kinda
17:14 <benzrf> so like for example, "(1:) undefined" is not the same as "undefined", so "fix (1:)" will not be undefined
17:15 <benzrf> in general, anything which forces its argument will have undefined as a fix point, because the argument gets forced so the whole thing is undefined
17:15 <benzrf> so "fix" only does interesting stuff on functions that don't necessarily force their argument
17:15 <benzrf> but that should already be obvious from its definition :)
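Two small, self-contained uses of fix matching the examples above:

    import Data.Function (fix)

    -- fix f = f (fix f); fix (1:) is the infinite list of ones lambdabot printed.
    ones :: [Int]
    ones = fix (1 :)

    -- Explicit recursion factored through fix, in the style glguy sketched earlier:
    factorial :: Integer -> Integer
    factorial = fix $ \rec n -> if n <= 1 then 1 else n * rec (n - 1)

    main :: IO ()
    main = print (take 5 ones, factorial 5)   -- ([1,1,1,1,1],120)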
17:17 nacon joined
17:17 nacon joined
17:23 iAmerikan joined
17:29 Rodya_ joined
17:29 newhoggy joined
17:31 govg joined
17:32 malaclyps joined
17:34 newhoggy_ joined
17:39 newhoggy joined
17:43 Levex joined
17:49 newhoggy joined
17:53 meandi_2 joined
17:54 newhoggy joined
18:00 abhiroop joined
18:00 <monochrom> The same can be said about recursion too.
18:01 <monochrom> "x = 0:x" works on Haskell and breaks on SML.
18:02 nicknovi1 joined
18:03 uglyfigurine joined
18:03 newhoggy joined
18:05 uglyfigurine joined
18:08 jrm joined
18:08 newhoggy joined
18:16 pilne joined
18:17 newhoggy joined
18:18 Levex joined
18:19 <ski> # let rec x = 0 :: x;;
18:19 <ski> val x : int list =
18:19 <ski> [0; 0; 0; 0; 0; 0; 0; 0; 0; 0; 0; 0; 0; 0; 0; 0; 0; 0; 0; 0; 0; 0; 0; 0; 0;
18:19 <ski> ...]
18:20 <monochrom> That's OCaml and I'm aware of that.
18:20 sigmundv_ joined
18:21 <monochrom> Furthermore it proves my point because OCaml makes a special case of "if the RHS starts with a constructor, we go lazy".
18:22 <ski> well .. not lazy, exactly
18:23 <ski> but if the recursive references are on a path of only constructors (including record constructions), or behind an abstraction, then it will patch the former of those up, after constructing the value
18:23 zero_byte joined
18:23 <ski> so .. you can't abstract that
18:28 newhoggy joined
18:30 iAmerikan joined
18:37 newhoggy joined
18:43 takle joined
18:45 newhoggy joined
18:46 crave joined
18:46 mehs joined
18:46 delexi joined
18:50 newhoggy joined
18:51 abhiroop_ joined
18:51 takle joined
18:54 MotherFlojo joined
18:56 newhoggy joined
18:58 carlomagno1 joined
19:04 newhoggy joined
19:12 abhiroop_ joined
19:16 newhoggy joined
19:21 slentzen joined
19:24 Deide joined
19:25 newhoggy joined
19:25 abhiroop joined
19:30 newhoggy joined
19:32 iAmerikan joined
19:37 Rodya_ joined
19:40 Rodya_ joined
19:42 newhoggy joined
19:42 abhiroop joined
19:47 newhoggy joined
19:56 newhoggy joined
19:57 geekosaur joined
20:02 newhoggy joined
20:06 abhiroop joined
20:08 MotherFlojo joined
20:12 Gurkenglas joined
20:18 newhoggy joined
20:18 abhiroop joined
20:25 newhoggy joined
20:30 newhoggy joined
20:37 abhiroop joined
20:38 newhoggy joined
20:45 newhoggy joined
20:47 argent0 joined
20:50 govg joined
20:50 newhoggy joined
20:59 newhoggy joined
21:04 newhoggy joined
21:10 zero_byte joined
21:14 Rodya_ joined
21:15 newhoggy joined
21:20 newhoggy joined
21:26 newhoggy joined
21:31 newhoggy joined
21:32 geekosaur joined
21:34 abhiroop joined
21:41 newhoggy joined
21:44 iAmerikan joined
21:46 redcedar joined
21:47 newhoggy joined
21:52 newhoggy joined
21:52 hiratara joined
21:57 aarvar joined
21:57 newhoggy_ joined
22:01 iAmerikan joined
22:04 kadoban joined
22:05 newhoggy joined
22:16 newhoggy joined
22:21 flounders joined
22:21 newhoggy joined
22:25 seanpars1 joined
22:30 newhoggy joined
22:31 hiratara joined
22:32 hphuoc25 joined
22:34 newhoggy_ joined
22:35 abhiroop joined
22:43 newhoggy joined
22:46 peterbecich joined
22:51 argent0 joined
22:58 newhoggy joined
23:02 peterbecich joined
23:02 takle joined
23:03 newhoggy joined
23:06 taksuyu joined
23:10 petru joined
23:15 seanparsons joined
23:21 takle joined
23:23 geekosaur joined
23:24 newhoggy joined
23:25 peterbecich joined
23:25 abhiroop joined
23:29 newhoggy joined
23:34 yellowj joined
23:54 eacameron joined