22 June 2018
00:03 conal_ joined
00:04 jtcs joined
00:04 abhiroop joined
00:12 abhiroop joined
00:13 mizu_no_oto_work joined
00:16 abhiroop joined
00:17 dogweather joined
00:26 dogweather joined
00:27 conal joined
00:30 MomusInvictus joined
00:33 vurtz joined
00:33 abhiroop joined
00:35 MomusInvictus joined
00:38 abhiroop joined
00:42 abhiroop joined
00:43 Cthalupa joined
00:45 iakritas joined
00:47 lazyinitialized joined
00:48 lazyinitialized joined
00:49 vurtz joined
00:51 comerijn joined
00:55 conal joined
00:57 andreabedini joined
01:05 andyhoang joined
01:11 vurtz joined
01:12 <smichel17> Hey. I'm trying to read a 2d array from stdin (one line per row, space separated). I've got that down, but the catch is all rows must be the same length (whatever length the first row is). Here's my current code:
01:13 <smichel17> https://paste.gnome.org/pjjbupims
01:14 <smichel17> Specifically, if it reads a line of the wrong length, it should prompt the user to re-enter that line and then call itself again, dropping whatever input it got last time
01:14 <smichel17> That part is easy
01:14 <smichel17> The part I'm struggling with is how to write this idiomatically (ie, without duplicating code)
01:15 <monochrom> You will have to duplicate code but it's very little duplication.
01:15 louispan joined
01:15 <smichel17> I need to read the first line to know how long it is, so I can't make a wrapper outside readMatrix and pass the first length
01:16 <monochrom> Have a second recursive function that takes a number as parameter and reads the 2nd to last lines and check their sizes and what you said about prompting the user.
01:17 Nik05 joined
01:18 <smichel17> Yeah, I suppose that's what I'll have to do. I was hoping I could leave the `read -> parse -> recurse` part intact, but oh well
01:18 <smichel17> Thanks :)
01:19 <monochrom> Because from the 2nd line onwards it's "(read -> parse -> check length -> recurse) -> recurse" so it's really a different control flow.
01:20 <smichel17> Yeah, I thought maybe I could do it with a clever let or guard
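A minimal sketch of the two-function structure monochrom suggests (the paste is no longer reachable, so the names readMatrix/readRow and the assumption that the number of rows is known up front are made up for illustration):

    import Control.Monad (replicateM)

    -- read the first row to fix the width, then read the remaining rows
    -- with a helper that re-prompts until a row of the right length is given
    readMatrix :: Int -> IO [[Int]]
    readMatrix rows = do
      first <- map read . words <$> getLine
      rest  <- replicateM (rows - 1) (readRow (length first))
      pure (first : rest)

    readRow :: Int -> IO [Int]
    readRow len = do
      row <- map read . words <$> getLine
      if length row == len
        then pure row
        else putStrLn "wrong length, please re-enter:" >> readRow len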
01:30 louispan joined
01:34 dogweather joined
01:50 dogweather joined
01:57 mbonneau joined
01:57 mud joined
01:59 louispan joined
02:06 andyhoang joined
02:08 abhiroop joined
02:15 Linter joined
02:18 dogweather joined
02:18 andyhoang joined
02:28 dogweather joined
02:34 dogweather joined
02:34 patlv joined
02:34 andyhoang joined
02:40 louispan joined
02:42 andyhoang joined
02:44 mizu_no_oto_work joined
02:51 jsondavis joined
02:56 conal joined
02:57 mizu_no_oto_work joined
03:01 dogweather joined
03:02 ToffeeYogurtPots joined
03:03 AetherWind joined
03:05 bbrodriguez joined
03:08 louispan joined
03:16 dogweather joined
03:21 kapil___ joined
03:36 dogweather joined
03:42 louispan joined
03:49 wotan__ joined
03:49 dogweather joined
03:57 dogweather joined
04:00 AetherWind_GJ joined
04:02 Linter joined
04:04 pfurla joined
04:04 taumuon joined
04:05 hphuoc25 joined
04:25 dogweather joined
04:40 hphuoc25_ joined
04:47 Ariakenom joined
04:50 howdoi joined
04:51 kroomey joined
05:06 pfurla joined
05:06 dogweather joined
05:09 pfurla joined
05:12 Cthalupa joined
05:19 slomo joined
05:27 obi_jan_kenobi__ joined
05:28 dogweather joined
05:34 Cthalupa joined
05:37 dogweather joined
05:44 Linter joined
05:45 Gurkenglas joined
05:59 louispan joined
06:20 louispan joined
06:27 hphuoc25 joined
06:36 dogweather joined
06:37 merijn joined
06:42 louispan joined
06:44 hamishmack joined
06:48 hphuoc25 joined
06:52 merijn joined
07:00 dogweather joined
07:07 <Squarism> I'm curious if there's any work going beyond DuplicateRecordFields? It feels as if it's not enough and people will continue to prefix field names to avoid collisions. When your 1000-line tests are setting up nested record datastructures, the bloat of prefixes and also the lens artifact of "_" really gets in the way.
07:08 dadabidet joined
07:09 <dminuoso> Squarism: https://ghc.haskell.org/trac/ghc/wiki/ExtensibleRecords covers some of the discussion bits of it.
07:09 <dminuoso> Squarism: I found that classy lenses can help alleviate some of the pain.
07:10 <Squarism> dminuoso, thanks. Havent heard of "classy lenses" =D
07:11 hphuoc25 joined
07:12 <dminuoso> Squarism: Also take note of `makeFieldsNoPrefix` when using lenses with DuplicateRecordFields
07:13 remyhr joined
07:13 dogweather joined
07:14 <Squarism> okok
07:15 <dminuoso> Squarism: Or you can even use makeLensesWith manually and pick the rules you want. :)
07:19 cur8or joined
07:19 AetherWind left
07:20 <Squarism> dminuoso, The first line of the document makes me calm anyway : "There seems to be widespread agreement that the current situation with regards to records is unacceptable"
07:20 Pupnik joined
07:21 <Squarism> Seems like its more of a management problem than lack of insight
07:23 <dminuoso> Squarism: Based on the discussions I've read it's slightly more complicated than that. The better solutions break backwards compatibility, the others don't seem to be promising.
07:23 zakora joined
07:25 <dminuoso> Squarism: But there are libraries you could use. Im just not too familiar with any of them to tell you more.
07:25 <Squarism> hmm ok. Im happy it has attention and there is a consensus its a problem.
07:26 <Squarism> dminuoso, Im at work now but ill give this some reading when I get time.
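A rough sketch of the makeFieldsNoPrefix pattern dminuoso mentions, assuming the lens package; the record and field names here are invented, and the exact set of required extensions may vary:

    {-# LANGUAGE DuplicateRecordFields, TemplateHaskell, MultiParamTypeClasses,
                 FunctionalDependencies, FlexibleInstances #-}
    import Control.Lens

    data Person  = Person  { _name :: String, _age :: Int }
    data Company = Company { _name :: String }

    makeFieldsNoPrefix ''Person
    makeFieldsNoPrefix ''Company

    -- Both records now share one overloaded `name` lens via a generated
    -- HasName class, so no per-type prefixes are needed:
    --   view name (Person "Ada" 36)  -- "Ada"
    --   view name (Company "Acme")   -- "Acme"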
07:28 Linter joined
07:35 merijn joined
07:35 agander joined
07:38 thc202 joined
07:39 Kongaloosh joined
07:41 fabsn joined
07:42 ackthet joined
07:48 Cthalupa joined
07:51 qu1j0t3 joined
07:52 ThomasLocke joined
08:00 Ahmedkh joined
08:04 zero_byte joined
08:05 Cthalupa joined
08:18 <infty> Given f::A->B->C and g::C->D, one can produce g#f::A->B->D by applying g to the output of f. I can define the operator (#) using curry and uncurry, but does there exist a version of this operator in base or Prelude? Otherwise, how do you handle this type of situation?
08:19 <dminuoso> infty: There's a combinator: (.) . (.)
08:19 <dminuoso> Sometimes written `(fmap . fmap)` or aliased as (.:)
08:20 <dminuoso> :t (.)
08:20 Ahmedkh left
08:20 <lambdabot> (b -> c) -> (a -> b) -> a -> c
08:20 <dminuoso> :t (.) . (.)
08:20 <lambdabot> (b -> c) -> (a1 -> a2 -> b) -> a1 -> a2 -> c
08:20 <dminuoso> :t (.) . (.) . (.)
08:20 <lambdabot> (b -> c) -> (a1 -> a2 -> a3 -> b) -> a1 -> a2 -> a3 -> c
08:20 <dminuoso> etc
08:21 <infty> Oh I see
08:22 <infty> I need a moment to wrap my head around why this works, but I love this kind of esoteric recursive formula
08:22 <infty> Happens a lot with haskell :D
08:23 <dminuoso> infty: In my opinion the fmap/Functor route is the simplest to get an intuition. The raw answer can be found by simply taking the definition of (.) and substituting.
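For reference, one way the raw substitution works out, using only (f . g) x = f (g x):

    --   ((.) . (.)) g f x y
    -- = ((.) ((.) g)) f x y     -- outer (.) applied to g
    -- = (((.) g) . f) x y       -- then applied to f
    -- = ((.) g (f x)) y         -- then to x
    -- = (g . f x) y
    -- = g (f x y)               -- i.e. post-compose g onto a two-argument f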
08:29 agander_ joined
08:37 grdryn joined
08:37 curious_corn joined
08:41 dxld joined
08:44 louispan joined
08:52 cur8or joined
09:03 agander joined
09:04 curious_corn joined
09:05 Folkol joined
09:15 hphuoc25 joined
09:18 cur8or_ joined
09:18 bergle joined
09:29 <infty> I got through the raw substitution, yay! I'll come back to this after I have learned about functors.
09:29 <infty> dminuoso: Thanks for your help!
09:34 louispan joined
09:37 hphuoc25 joined
09:39 louispan joined
09:40 curious_corn joined
09:48 louispan joined
09:50 <greeny_> i get a headache trying to understand why (.) . (.) works
09:50 <dminuoso> greeny_: Do you understand the Functor instance of ((->) a)?
09:52 <greeny_> i guess
09:52 <dminuoso> greeny_: How would you map over the numbers with (+1) in some `l :: [[Int]]`
09:53 <greeny_> fmap (+1) [1..10]
09:53 <dminuoso> greeny_. Good. What if you have a nested list like in the type signature?
09:54 rzp joined
09:54 <greeny_> another fmap. fmap (fmap (+1)) [[1..10]]
09:55 <greeny_> :t fmap (fmap (+1)) [[1..10]]
09:55 <lambdabot> (Enum b, Num b) => [[b]]
09:55 <dminuoso> greeny_: Great. Use function composition instead of nesting fmap applications.
09:56 hphuoc25 joined
09:57 <greeny_> uff
09:57 <greeny_> difficult
09:57 <dminuoso> greeny_: very simple :)
09:57 <dminuoso> @src (.)
09:57 <lambdabot> (f . g) x = f (g x)
09:57 Linter joined
09:58 <dminuoso> greeny_: You can read this as a statement of equality. Whenever you have something of the shape `f (g x)` you can write it as `(f . g) x` instead
09:58 <dminuoso> Or the other way around. They are interchangeable.
09:58 <greeny_> ok give me a second to figure it out
09:59 curious_corn joined
10:00 <greeny_> ah ok; (fmap . fmap) (+1) [[1..10]]
10:00 <dminuoso> Great! :)
10:01 <dminuoso> Can you guess what the case would be for a list inside a list inside a list? That is a thrice nested list
10:01 <dminuoso> Or better than guessing, determine it. :)
10:01 <greeny_> :t (fmap . fmap . fmap) (+1)
10:01 <lambdabot> (Num b, Functor f3, Functor f2, Functor f1) => f1 (f2 (f3 b)) -> f1 (f2 (f3 b))
10:01 <dminuoso> Very nice.
10:01 <dminuoso> So `fmap` lifts a function to act on a functor
10:02 machinedgod joined
10:02 <dminuoso> well, on a *container of sorts.
10:02 <dminuoso> (fmap . fmap) lifts a function to act on a doubly nested *container of sorts
10:02 <dminuoso> and so forth
10:02 <dminuoso> greeny_: Does this make sense so far?
10:02 <greeny_> yes
10:03 <dminuoso> greeny_: what is the implementation for instance Functor ((->) a)?
10:05 <greeny_> (.) ?
10:05 <dminuoso> Right. So in a way `fmap` lets you "map over the result of a function"
10:05 <dminuoso> (fmap . fmap) lets you "map over the result of the result of a function"
10:06 <dminuoso> (fmap . fmap . fmap) lets you "map over the result of the result of the result of a function"
10:06 <dminuoso> and because fmap = (.) for this particular type, its equivalent to just (.) . (.) . (.)
10:06 <dminuoso> Which incidentally is equivalent to `fmap fmap fmap fmap fmap`
10:06 <greeny_> ok now i get it
10:07 <greeny_> thx for your time explaining it to me
10:07 <dminuoso> You are welcome :)
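The instance greeny_ was asked about, as it appears in base (GHC.Base):

    instance Functor ((->) r) where
        fmap = (.)
    -- so for functions, fmap g f = g . f: "map over the result of f"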
10:11 hphuoc25 joined
10:12 <cppxor2arr> function composition <3
10:16 curious_corn joined
10:16 dogweather joined
10:21 louispan joined
10:28 hphuoc25 joined
10:48 matsurago joined
11:04 emilypi joined
11:13 louispan joined
11:19 hphuoc25 joined
11:21 glguy joined
11:22 ices joined
11:28 curious_corn joined
11:35 malaclyps joined
11:35 vurtz joined
11:40 tvN joined
11:41 tvN left
11:49 b-b joined
11:54 agander joined
12:01 andyhoang joined
12:02 louispan joined
12:10 patlv joined
12:10 cur8or joined
12:12 carlomagno joined
12:13 lumm joined
12:14 andreabedini joined
12:17 slomo joined
12:17 slomo joined
12:28 Linter joined
12:39 remyhr joined
12:40 jsondavis joined
12:43 cur8or joined
12:44 hphuoc25 joined
12:44 5EXAA7D68 joined
12:44 dogweather joined
12:45 eskimag joined
12:51 p0lyph3m joined
12:59 curious_corn joined
13:09 m1dnight1 joined
13:13 dogweather joined
13:17 pbrant joined
13:27 agander joined
13:32 emilypi joined
13:48 dogweather joined
13:49 dogweath_ joined
13:54 carlomagno joined
14:00 pfurla joined
14:15 curious_corn joined
14:18 kapil___ joined
14:24 dogweather joined
14:46 abhiroop joined
14:56 merijn joined
14:57 <cppxor2arr> which is more readable?
14:57 <cppxor2arr> map ((`mod` n) . (+1) . (^2)) xs
14:57 <cppxor2arr> or
14:57 <cppxor2arr> map (\x -> (x^2+1) `mod` n) xs
14:58 Linter joined
14:58 <shapr> I like the second, is n defined outside the lambda?
14:59 <cppxor2arr> yes
15:02 fabsn joined
15:03 hphuoc25 joined
15:05 <shapr> cppxor2arr: although I use <$> more than map the past year or two
15:05 <shapr> > (+1) <$> [9,7..1]
15:05 <lambdabot> [10,8,6,4,2]
15:06 <cppxor2arr> oh nice
15:06 <* cppxor2arr> goes to learn some more to understand
15:07 <shapr> cppxor2arr: how'd you get started learning Haskell?
15:08 <cppxor2arr> i heard about haskell and how it was concise and elegant. tried learning and got hooked
15:08 <shapr> cool :-)
15:08 <shapr> I was trying to find a more concise and elegant way to write Python, someone said it looked like I was trying to write Haskell in Python
15:08 <shapr> so I looked up Haskell
15:08 <shapr> got hooked
15:09 <cppxor2arr> yeah i thought i found the most concise language when i learned python. until now :-)
15:10 <shapr> this is one good free resource http://www.cis.upenn.edu/~cis194/spring13/lectures.html
15:10 <dminuoso> I can second CIS194. It's a great course to learn by.
15:10 <cppxor2arr> thanks. will read it
15:11 <shapr> cppxor2arr: got any more questions? you're gonna find so much awesome :-)
15:11 <cppxor2arr> i can tell :) i've been excited every day to learn haskell recently
15:12 <cppxor2arr> i'll probably pop back in here to ask some questions
15:12 <shapr> yeah, show up anytime
15:13 bbrodriguez joined
15:14 <* shapr> hops cheerfully
15:14 <* cppxor2arr> goes to sleep cheerfully
15:14 <shapr> I did a bunch of Haskell webdev this past week, should lead to some "how to do things" blog posts
15:14 <cppxor2arr> link?
15:14 <shapr> blog posts aren't up yet
15:15 <shapr> but I'm rebooting my ancient blog: https://shapr.github.io/
15:15 <cppxor2arr> look forward to see it
15:16 <shapr> :-D
15:16 <dminuoso> cppxor2arr: Here's my experience with Haskell: I just started hacking on cabal a day ago. Based on the properties that Haskell enjoys I could immediately jump into a module, refactor some code with ease, and be done.
15:17 <cppxor2arr> i heard about how easy it is to refactor haskell code without knowing what the code really does
15:17 <dminuoso> Yup =)
15:17 <dminuoso> Well you have to understand the code locally of course.
15:18 <cppxor2arr> yeah
15:18 <dminuoso> But you dont need to grasp the full call trees. It's enough to just look at the types of the functions that code uses to get a good idea.
15:19 <dminuoso> No hidden dependencies, no hidden mutable state, no hidden side effects
15:19 <cppxor2arr> <3 haskell
15:21 chrisdotcode joined
15:25 hphuoc25 joined
15:26 lainon joined
15:32 curious_corn joined
15:32 owiecc joined
15:35 lumm joined
15:35 hvr joined
15:36 cschneid joined
15:46 patlv joined
15:48 cur8or joined
15:49 remyhr joined
15:51 obi_jan_kenobi__ joined
15:51 remyhr joined
15:52 comerijn joined
15:56 <nitrix> cppxor2arr: <$> is a generalization of map. `map` works only on lists, but there are other data structures that just so happens to have similar properties, those that implement the Functor typeclass.
15:57 <nitrix> > (+1) <$> Just 42
15:57 <lambdabot> Just 43
15:57 <cppxor2arr> nitrix: sounds like an abstraction. excited to learn about it!
15:57 <nitrix> > (+1) <$> (10, 0)
15:57 <lambdabot> (10,1)
15:57 <cppxor2arr> oh wow
15:57 <* cppxor2arr> is dazzled
15:57 <nitrix> > (+1) <$> [1,2,3]
15:57 <lambdabot> [2,3,4]
15:58 <* cppxor2arr> faints
15:58 <cppxor2arr> this is so cool
15:58 iAmerikan joined
15:59 <nitrix> Works on binary trees and whatnot. Yeah, it's pretty great :)
15:59 <cppxor2arr> > (+1) <$> (1,2,3)
15:59 <lambdabot> error:
15:59 <lambdabot> • No instance for (Functor ((,,) Integer Integer))
15:59 <lambdabot> arising from a use of ‘e_11123’
15:59 <cppxor2arr> hehe
16:00 <nitrix> Your type (Integer, Integer, Integer) doesn't have a default Functor instance, but you could write one.
16:01 <cppxor2arr> It isn't `Integral a => a`?
16:01 Gurkenglas joined
16:02 <nitrix> Even more general :) Integer literals will have the type `Num a => a`. I was trying not to confuse you :P
16:02 <cppxor2arr> :P
16:05 <nitrix> @let instance Functor ((,,) Integer Integer) where fmap f (x,y,z) = (x, y, f z)
16:05 <lambdabot> Defined.
16:06 <cppxor2arr> > (+1) <$> (1,2,3)
16:06 <nitrix> > (+1) <$> (1, 2, 3) :: (Integer, Integer, Integer)
16:06 <lambdabot> (1,2,4)
16:06 <lambdabot> (1,2,4)
16:06 <cppxor2arr> ha
16:07 <nitrix> The reason why only the last one can be mapped is an exercise left to the reader :P
16:08 <cppxor2arr> definitely need to learn more first
16:09 <nitrix> It requires a bit more type juggling to make it work for the other members :P
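A sketch of the "type juggling" nitrix alludes to: Functor wants a type constructor of kind * -> *, and ((,,) a b) only has its last slot free, so fmap can only touch the last component. To map another slot you can reorder the parameters with a wrapper (the newtype here is hypothetical):

    newtype First3 b c a = First3 (a, b, c)

    instance Functor (First3 b c) where
        fmap f (First3 (x, y, z)) = First3 (f x, y, z)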
16:09 <nitrix> Where are you at in your learnings :D ?
16:09 bbrodriguez joined
16:10 abhiroop joined
16:10 <cppxor2arr> haven't learned algebraic types, functors, applicatives, monads yet
16:10 <cppxor2arr> and the likes
16:11 <nitrix> Yeah you most likely want Data types and Algebraic Data Types first.
16:11 <nitrix> Then learn about Type Classes and start getting an intuition about the ones that exist.
16:12 <cppxor2arr> will do
16:12 <nitrix> Haskell's learning curve is a bit painful for that, since right off the bat you'll probably be exposed to types that are fairly general, like `Num a => a`, but you can limit yourself to familiar concrete types until then :P
16:13 <nitrix> Things like Int, Bool, String, whatever you used to have in your other language :]
16:13 vurtz joined
16:14 <cppxor2arr> i think i got the hang of the general types though
16:15 <cppxor2arr> Integral (Int, Integer). Floating (Float, Double). Num (Integral, Floating) ?
16:21 <nitrix> Integral, Floating and Num are type classes :)
16:22 <nitrix> Actually, I guess people do get introduced very early to those.
16:22 <nitrix> :t (+)
16:22 <lambdabot> Num a => a -> a -> a
16:22 <cppxor2arr> so type classes include one type or more?
16:24 <nitrix> They don't quite "include", it's slightly reversed. Type Classes define an interface that the types must satisfy if they want to belong in that group.
16:24 Azel joined
16:25 pie_ joined
16:25 <cppxor2arr> Is it like mathematics?
16:25 <cppxor2arr> Natural numbers and integers are rational numbers
16:25 <nitrix> For example, the Num type class defines (+), (-), (*), so types like Int and Double and others will need to provide an implementation (instance of that type class) for them to belong in that group.
16:25 curious_corn joined
16:26 <nitrix> Because they provide an implementation for those operations, it's what makes you able to use (+) on any of those types.
16:26 <cppxor2arr> oh that makes sense
16:27 Linter joined
16:27 <nitrix> @let class Speaker s where speak :: s -> String
16:27 <lambdabot> Defined.
16:28 <nitrix> @let data Cat = Cat; data Dog = Dog;
16:28 <lambdabot> Defined.
16:28 <cppxor2arr> @type Cat
16:28 <lambdabot> Cat
16:28 <nitrix> @let instance Speaker Cat where speak = "Meow!"
16:28 <lambdabot> .L.hs:192:17: error:
16:28 <lambdabot> • Couldn't match expected type ‘Cat -> String’
16:28 <lambdabot> with actual type ‘[Char]’
16:28 <nitrix> @let instance Speaker Cat where speak _ = "Meow!"
16:28 slomo joined
16:28 slomo joined
16:28 <lambdabot> Defined.
16:28 <nitrix> @let instance Speaker Dog where speak _ = "Woof!"
16:28 <lambdabot> Defined.
16:29 <nitrix> > speak Dog
16:29 <lambdabot> "Woof!"
16:29 <nitrix> > speak Cat
16:29 <lambdabot> "Meow!"
16:29 <cppxor2arr> wow cool
16:29 <cppxor2arr> how long do these stay defined? for lambdabot
16:29 conal joined
16:30 <nitrix> The benefit is that anyone can come up with new types and implement the interfaces that corresponds to what the type is able to do.
16:30 <nitrix> cppxor2arr: Forever, until someone clears them with @undefine
16:30 <cppxor2arr> ok
16:30 <nitrix> Which happens quite frequently :P
16:31 <nitrix> cppxor2arr: By the way, the Cat above is a nullary constructor, so basically a constant. It takes no arguments and produces something of type Cat.
16:31 <nitrix> Confusing because I named them both the same.
16:32 <nitrix> data Bool = True | False
16:32 <nitrix> That's an easier example.
16:32 <cppxor2arr> oh i wanted to ask this
16:32 <nitrix> :t True
16:32 <lambdabot> Bool
16:32 <cppxor2arr> (-5) is just what?... and (5-) is a function
16:32 <cppxor2arr> isthe (-) in the first one a function?
16:33 <nitrix> cppxor2arr: (5-) is a partially applied function (a section) that'll subtract its argument from 5.
16:33 <cppxor2arr> what about (-5)? is the (-) just not a function?
16:33 <nitrix> And you'd think (-5) is also a partially applied function that'd subtract 5 from a number, except it isn't.
16:34 <cppxor2arr> yeah that's `subtract 5` right?
16:34 <nitrix> We needed a way to represent negative numbers, so (-5) is actually just the number -5.
16:34 <glguy> - in (-5) is 'negate' from the 'Num' typeclass
16:34 <glguy> -5 is (negate 5)
16:34 <nitrix> Yeah, you have to do negate for that.
16:34 <cppxor2arr> ah ok
16:34 <cppxor2arr> @src negate
16:34 <lambdabot> negate x = 0 - x
16:34 Ariakenom joined
16:36 <glguy> There's actually an extension in GHC to support negative integer literals directly, but that's not the normal behavior
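A quick side-by-side of the three spellings discussed above:

    -- (5-) 2        ==  3         -- section: 5 - 2
    -- subtract 5 2  == -3         -- 2 - 5
    -- (-5)          == negate 5   -- a literal negative number, not a section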
16:39 <cppxor2arr> question
16:39 <cppxor2arr> why does `f n = foldl1 (.) . replicate n` work
16:39 <cppxor2arr> but not `f = foldl1 (.) . replicate` ?
16:42 <glguy> foldl1 (.) . replicate ===> \x -> foldl1 (.) (replicate x)
16:43 <glguy> now you're applying (foldl1 (.)) to a function instead of a list
16:44 fabsn joined
16:47 <cppxor2arr> with type annotations can't it be `\x y -> foldl1 (.) (replicate x y)` ?
16:47 slomo joined
16:47 slomo joined
16:47 <glguy> No, (.) only has one definition
16:47 <glguy> ?src (.)
16:47 <lambdabot> (f . g) x = f (g x)
16:48 <cppxor2arr> aha!
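A small sketch of glguy's point: in the working form n stays explicit, so replicate is fully applied before foldl1 (.) ever sees it; the fully point-free spelling needs the (.:) combinator nitrix defines just below:

    f :: Int -> (a -> a) -> (a -> a)
    f n = foldl1 (.) . replicate n      -- f 3 (+1) 0 == 3

    -- point-free, with (.:) = (.) . (.):
    --   f = foldl1 (.) .: replicate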
16:49 merijn joined
16:50 <nitrix> @let (.:) = (.) . (.)
16:50 <lambdabot> Defined.
16:50 <nitrix> :t (.:)
16:50 <lambdabot> (b -> c) -> (a1 -> a2 -> b) -> a1 -> a2 -> c
16:50 <nitrix> Ah, still wrong.
16:51 <nitrix> Nevermind, that should work.
16:52 <nitrix> You can compose a function that accepts 1 argument with another function that accepts two arguments with this, but it's probably better to not try to be too clever :)
16:53 <* cppxor2arr> is trying to parse this
16:55 <nitrix> It reads as:
16:57 <nitrix> (.::) :: (tmp -> final) -> (input1 -> input2 -> tmp) -> (input1 -> input2 -> final)
16:57 jsondavis joined
16:58 <nitrix> I meant (.:), I can't type x]
16:59 <nitrix> @let magical = (*10) .: (+)
16:59 <lambdabot> Defined.
16:59 <nitrix> > magical 1 2
16:59 <lambdabot> 30
17:00 <cppxor2arr> that's more intuitive now
17:00 <nitrix> Point-free function composition is scary enough, this is stretching it for most people.
17:00 <nitrix> I hate reading code like that :)
17:00 <nitrix> But as demonstrated, it can go a long way.....
17:10 Arcaelyx joined
17:11 <cppxor2arr> :t (.:) .: (.:)
17:11 <lambdabot> (b1 -> c) -> (b2 -> a1 -> b1) -> (a3 -> a4 -> b2) -> a3 -> a4 -> a1 -> c
17:19 <nitrix> @let (.::::) = (.:) .: (.:)
17:19 <lambdabot> Defined.
17:20 <cppxor2arr> oh my
17:20 <nitrix> You're making it difficult :P
17:23 <nitrix> I can't think of a use case :]
17:24 conal joined
17:25 Gurkenglas joined
17:26 louispan joined
17:28 <nitrix> > isOdd .:::: negate (*) 5 5 1
17:28 <lambdabot> error:
17:28 <lambdabot> Variable not in scope: isOdd :: Integer -> c
17:28 <nitrix> > odd .:::: negate (*) 5 5 1
17:28 <lambdabot> error:
17:28 <lambdabot> • No instance for (Typeable a20)
17:28 <lambdabot> arising from a use of ‘show_M441922269276143044227306’
17:28 <nitrix> Aww :(
17:30 <nitrix> Anyway, it applies (*) on 5 and 5, then the result is given to `negate`, to which we give the `1`, and that result is given to `odd` which gives us True, but I think I did something wrong.
17:30 <nitrix> My brain hurts too :P
17:33 <nitrix> > odd .:::: subtract (*) 5 5 1
17:33 <lambdabot> error:
17:33 <lambdabot> • No instance for (Typeable a20)
17:33 <lambdabot> arising from a use of ‘show_M438395528117798440027383’
17:34 <nitrix> > (odd .:::: subtract) (*) 5 5 1
17:34 <lambdabot> False
17:34 <nitrix> What a monstrosity.
17:34 <* cppxor2arr> goes to sleep with his mind boggled
17:35 <nitrix> cppxor2arr: One can always build abstractions on top of abstractions ad infinitum. You have to stop somewhere, ideally when it stops being useful.
17:35 <cppxor2arr> yeah for sure
17:36 <cppxor2arr> sounds like d/t, d/t^2, d/t^3
17:37 <nitrix> Let me know how the learnings go c:
17:37 <cppxor2arr> yup gotta sleep now
17:43 Deide joined
17:53 merijn joined
18:05 lainon joined
18:05 emmanuel_erc joined
18:05 <emmanuel_erc> hello there
18:09 <emmanuel_erc> I just solved one of the problems on hackerrank (https://www.hackerrank.com/challenges/sam-and-substrings/problem). When I rewrite certain parts of my code to be, ostensibly, more efficient, I get serious performance issues. Here is the link (https://lpaste.net/5732083981969522688) to my submission.
18:10 <emmanuel_erc> My question is why I get serious performance inefficiencies when, for example, I change the scanl1 portion to a hand-written loop. Or, even better, rewrite the entire solution as a hand-written loop?
18:10 Big_G joined
18:13 huxing joined
18:15 replay joined
18:16 Tspoon_ joined
18:29 bbrodriguez joined
18:31 abhiroop joined
18:32 patlv joined
18:35 mizu_no_oto_work joined
18:36 Cale joined
18:42 ratsch joined
18:44 ratschance joined
18:47 lainon joined
18:49 conal joined
18:51 emilypi joined
18:54 comerijn joined
18:58 Linter joined
19:01 skeet70 joined
19:02 abhiroop joined
19:05 agander joined
19:15 herr_jth joined
19:17 Arcaelyx joined
19:17 agander joined
19:18 lumm joined
19:19 patlv joined
19:19 andyhoang joined
19:20 agander_ joined
19:25 pfurla joined
19:30 <joel135> emmanuel_erc: it may be that you remove binders?
19:30 <joel135> > let f n = case n of { 0 -> 0; 1 -> 1; _ -> f (n-1) + f (n-1)} in f 20
19:30 <lambdabot> 524288
19:30 <joel135> let f n = case n of { 0 -> 0; 1 -> 1; _ -> let m = f (n-1) in m + m} in f 20
19:31 <joel135> the second one will be much faster
19:31 lumm_ joined
19:33 abhiroop joined
19:34 kapil___ joined
19:39 dogweather joined
19:40 mounty joined
19:40 lumm joined
19:42 twopoint718 joined
19:42 abhiroop joined
19:42 <emmanuel_erc> joel135: where are the binders in my code?
19:43 <emmanuel_erc> I'm also a little confused by what you mean, if you'll excuse me
19:43 <joel135> the code you posted at lpaste is the quicker version, right?
19:44 <emmanuel_erc> yes it is the fastest version that I've written
19:44 <emmanuel_erc> I tried writing this vector
19:44 <emmanuel_erc> but I understand why the list version would be faster
19:44 <emmanuel_erc> I think
19:45 mounty joined
19:48 <joel135> i am imagining that scanl1 uses a binding for the accumulated value and maybe you are removing it when you make your handwritten version
19:49 <joel135> could you post a slower version?
19:49 <emmanuel_erc> sure
19:50 Ariakenom joined
19:51 lumm joined
19:53 <emmanuel_erc> I just tried a handwritten version, it was fine
19:53 <emmanuel_erc> I'm not sure what the issue was then, but an earlier version was correct but a lot slower
19:54 merijn joined
19:54 <emmanuel_erc> I mean it is slightly slower
19:55 <emmanuel_erc> I don't think I need to post it
19:55 <joel135> ok
19:55 <emmanuel_erc> but what you were saying is correct, ghc is doing some magic there
19:58 lumm joined
20:00 lumm joined
20:00 <joel135> it is an aspect of lazy evaluation
20:01 mizu_no_oto_work joined
20:02 <emmanuel_erc> I do have a previous solution that is intolerably slow and wasteful. The funny thing is that I thought I was being clever. I'm not entirely sure why this one goes kaput (here is the link: https://lpaste.net/8286289724251832320)
20:03 delexi joined
20:07 <joel135> it is very quick for me - how big input does it need to become too slow?
20:08 <jared-w> which one is the quick one for you, joel135? Are you talking about the intolerably slow one?
20:08 <joel135> yes
20:08 <emmanuel_erc> One of the inputs was 199002 characters long
20:08 <jared-w> nice
20:08 <joel135> ok let me try that
20:09 <jared-w> I'd suspect that the foldr is blowing things up
20:09 <jared-w> (or rather the fact that your accumulator is not strict in the slow one and it is in the fast one)
20:09 <emmanuel_erc> ah
20:09 <emmanuel_erc> that would do it
20:10 <jared-w> throw a few bangs on the slow nonsense and see if that fixes the speed :)
20:11 <emmanuel_erc> as soon as I put even one bang, I get <<loop>>
20:11 <emmanuel_erc> I don't think foldr is a viable option here
20:12 <jared-w> hah, that's funny
20:12 dogweather joined
20:13 <emmanuel_erc> I would have thought the only real thunk that would matter would the thunk for the length of the list (l)
20:13 <emmanuel_erc> but nvm I guess
20:14 abhiroop joined
20:15 whitephoenix joined
20:17 <joel135> they are about as slow for me
20:19 striapach joined
20:19 <emmanuel_erc> I suppose that using laziness in that sense is limited to either small inputs or small expressions
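A generic illustration of the strict-accumulator point jared-w made (the paste itself is unavailable, so this is not the original code): a lazy left fold builds a chain of thunks, while foldl' forces the accumulator as it goes and runs in constant space.

    import Data.List (foldl')

    sumLazy, sumStrict :: [Int] -> Int
    sumLazy   = foldl  (+) 0   -- thunks pile up: ((0+1)+2)+3 ...
    sumStrict = foldl' (+) 0   -- each partial sum is forced immediately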
20:21 koki joined
20:23 Ariakenom joined
20:26 p0lyph3m joined
20:31 iAmerikan joined
20:31 <joel135> you can gain a lot from using the laws of modular arithmetic in this problem
20:34 <joel135> https://lpaste.net/8720930823945060352
20:36 drewr joined
20:39 <joel135> oops i removed something i shouldn't have in `main = ...`
20:41 <joel135> there https://lpaste.net/9075937122526953472
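The pastes are no longer reachable; as a hedged guess at what modular-arithmetic helpers (like the (%+) and (%*) emmanuel_erc mentions a bit further down) might look like, keeping every intermediate result reduced so the numbers never grow large:

    m :: Integer
    m = 10^9 + 7      -- the usual modulus for this kind of problem (an assumption here)

    (%+), (%*) :: Integer -> Integer -> Integer
    a %+ b = (a + b) `mod` m
    a %* b = (a * b) `mod` m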
20:42 vurtz joined
20:45 abhiroop joined
20:52 kaictl joined
20:54 comerijn joined
20:55 lainon joined
21:01 abhiroop joined
21:02 mstruebing1 joined
21:15 <emmanuel_erc> joel135: yeah, you're right
21:15 <emmanuel_erc> thanks
21:16 <emmanuel_erc> actually when I added the functions %+ and %* to the slow function I wrote, I started getting good performance
21:17 <joel135> that's cool
21:19 <emmanuel_erc> though it does use more memory than the stricter version (unsurprisingly I suppose)
21:19 <joel135> yes i noticed at some point that one of the solutions needed 15 GB of ram
21:19 <joel135> that's not cool
21:20 <joel135> :P
21:20 <emmanuel_erc> yeah, one of them actually requires 29MB on my machine
21:21 <emmanuel_erc> I think the better figure to look at is maximum residency
21:21 <emmanuel_erc> if you are doing +RTS -s -RTS
21:21 <joel135> what is "+RTS -s -RTS"?
21:22 <emmanuel_erc> It does not seem to me that this problem should require so much memory.
21:22 pfurla joined
21:23 <emmanuel_erc> like if you were to compile your program and then run the executable, (as in ./program), then you could instead write (./program +RTS -s -RTS) and you'll see some profiling information.
21:23 <joel135> i didn't know about that
21:25 <emmanuel_erc> Yeah, I am not as familiar with every piece of information that report gives as I would want to be. But maximum residency is the better figure to look at, if you want to know the largest amount of memory used at any particular point in time.
21:25 <emmanuel_erc> the total memory figure counts all bytes allocated over the lifetime of the program
21:25 <joel135> ok i just used my system monitor
21:25 <emmanuel_erc> so that can be larger than you'd think it to be
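For the record, a typical invocation looks roughly like this (compiling with -rtsopts enables the full set of RTS flags; program.hs and input.txt are placeholder names):

    $ ghc -O2 -rtsopts program.hs
    $ ./program +RTS -s -RTS < input.txt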
21:26 <emmanuel_erc> which distro are you using? ubuntu?
21:26 <joel135> arch with kde
21:27 <emmanuel_erc> oh, kde comes with a system monitor, right?
21:27 <joel135> yes
21:27 abhiroop joined
21:27 Linter joined
21:29 replay joined
21:34 hphuoc25 joined
21:41 foobarbah joined
21:44 rzp joined
21:45 louispan joined
21:48 conal joined
21:54 merijn joined
22:02 bergle2 joined
22:02 lainon joined
22:05 striapach_ joined
22:06 tdjones joined
22:07 tdjones joined
22:13 lumm joined
22:14 bbrodr joined
22:15 abhiroop joined
22:16 aarvar joined
22:21 lainon joined
22:32 hamishmack joined
22:33 carlomagno joined
22:38 cschnei__ joined
22:45 abhiroop joined
22:48 abhiroop joined
22:55 comerijn joined
22:57 zero_byte joined
22:59 dogweather joined
23:00 lainon joined
23:04 Linter joined
23:13 dogweather joined
23:20 _ikke_ joined
23:25 abhiroop joined
23:26 dogweather joined
23:27 iAmerikan joined
23:32 jsondavis joined
23:35 patlv joined
23:43 pfurla joined
23:44 patlv joined
23:44 dogweather joined
23:46 _ikke_ joined
23:47 whitephoenix joined
23:50 louispan joined
23:55 merijn joined