VegOwOtenks

joined 8 months ago
[–] VegOwOtenks 2 points 3 weeks ago (1 children)

Does the IORef propagate back up the recursion tree? If you modify the IORef at a depth of 15, does the calling function also see the update? And is there also a non-IO Ref?
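A minimal sketch of what I mean, with a made-up deepIncrement standing in for the real recursion (Data.STRef would be the non-IO counterpart I'm asking about):

import Data.IORef

deepIncrement :: IORef Int -> Int -> IO ()
deepIncrement ref 0     = modifyIORef' ref (+ 1)         -- write at the bottom of the recursion
deepIncrement ref depth = deepIncrement ref (depth - 1)  -- every caller shares the same cell

main :: IO ()
main = do
        ref <- newIORef 0
        deepIncrement ref 15
        readIORef ref >>= print  -- prints 1: the update survives the return to the caller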

[–] VegOwOtenks 2 points 3 weeks ago (3 children)

Haskell

Sometimes I want something mutable. This one takes 0.3s, and profiling tells me 30% of my time is spent creating new objects. :/

import Control.Arrow

import Data.Map.Strict (Map)

import qualified Data.Map.Strict as Map
import qualified Data.Maybe as Maybe

type StoneCache = Map Int Int
type BlinkCache = Map Int StoneCache

parse :: String -> [Int]
parse = lines >>> head >>> words >>> map read

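-- Count how many stones a single stone with value n becomes after i blinks.
-- The cache maps blinks-remaining -> stone value -> count and is threaded through by hand.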
memoizedCountSplitStones :: BlinkCache -> Int -> Int -> (Int, BlinkCache)
memoizedCountSplitStones m 0 _ = (1, m)
memoizedCountSplitStones m i n 
        | Maybe.isJust maybeMemoized = (Maybe.fromJust maybeMemoized, m)
        | n == 0     = do
                let (r, rm) = memoizedCountSplitStones m (pred i) (succ n)
                let rm' = cacheWrite rm i n r
                (r, rm')
        | digitCount `mod` 2 == 0 = do
                let (r1, m1) = memoizedCountSplitStones m  (pred i) firstSplit
                let (r2, m2) = memoizedCountSplitStones m1 (pred i) secondSplit
                let m' = cacheWrite m2 i n (r1+r2)
                (r1 + r2, m')
        | otherwise = do
                let (r, m') = memoizedCountSplitStones m (pred i) (n * 2024)
                let m'' = cacheWrite m' i n r
                (r, m'')
        where
                secondSplit    = n `mod` (10 ^ (digitCount `div` 2))
                firstSplit     = (n - secondSplit) `div` (10 ^ (digitCount `div` 2))
                digitCount     = succ . floor . logBase 10 . fromIntegral $ n
                maybeMemoized  = cacheLookup m i n

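-- Fold one starting stone into the running (total, cache) pair.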
foldMemoized :: Int -> (Int, BlinkCache) -> Int -> (Int, BlinkCache)
foldMemoized i (r, m) n = (r + r2, m')
        where
                (r2, m') = memoizedCountSplitStones m i n

cacheWrite :: BlinkCache -> Int -> Int -> Int -> BlinkCache
cacheWrite bc i n r = Map.adjust (Map.insert n r) i bc

cacheLookup :: BlinkCache -> Int -> Int -> Maybe Int
cacheLookup bc i n = do
        sc <- bc Map.!? i
        sc Map.!? n

emptyCache :: BlinkCache
emptyCache = Map.fromList [ (i, Map.empty) | i <- [1..75]]

part1 = foldl (foldMemoized 25) (0, emptyCache)
        >>> fst
part2 = foldl (foldMemoized 75) (0, emptyCache)
        >>> fst

main = getContents
        >>= print
        . (part1 &&& part2)
        . parse
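
Threading the cache through every call is what forces all the tuple plumbing above; a rough sketch of the same memoization with Control.Monad.State doing the threading instead (assuming the mtl package; countStones and Cache are just illustrative names, not part of the solution above):

import Control.Monad.State.Strict
import Data.Map.Strict (Map)
import qualified Data.Map.Strict as Map

type Cache = Map (Int, Int) Int  -- (blinks remaining, stone value) -> stone count

countStones :: Int -> Int -> State Cache Int
countStones 0 _ = pure 1
countStones i n = do
        cached <- gets (Map.lookup (i, n))
        case cached of
                Just r  -> pure r
                Nothing -> do
                        r <- step
                        modify' (Map.insert (i, n) r)  -- the cache update stays hidden inside State
                        pure r
    where
        step
                | n == 0          = countStones (i - 1) 1
                | even digitCount = (+) <$> countStones (i - 1) firstHalf
                                        <*> countStones (i - 1) secondHalf
                | otherwise       = countStones (i - 1) (n * 2024)
        digitCount = length (show n)
        (firstHalf, secondHalf) = n `divMod` (10 ^ (digitCount `div` 2))

Call sites would then look like evalState (sum <$> mapM (countStones 75) stones) Map.empty and never see the cache at all.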
[–] VegOwOtenks 1 points 3 weeks ago

Thank you for the link, this is crazy!

[–] VegOwOtenks 2 points 3 weeks ago

Haskell

Cool task, nothing to optimize

import Control.Arrow

import Data.Array.Unboxed (UArray)
import Data.Set (Set)

import qualified Data.Char as Char
import qualified Data.List as List
import qualified Data.Set as Set
import qualified Data.Array.Unboxed as UArray

parse :: String -> UArray (Int, Int) Int
parse s = UArray.listArray ((1, 1), (n, m)) . map Char.digitToInt . filter (/= '\n') $ s
        where
                n = takeWhile (/= '\n') >>> length $ s
                m = filter (== '\n') >>> length >>> pred $ s

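-- In-bounds orthogonal neighbours whose height is exactly one above the current cell's.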
reachableNeighbors :: (Int, Int) -> UArray (Int, Int) Int -> [(Int, Int)]
reachableNeighbors p@(py, px) a = List.filter (UArray.inRange (UArray.bounds a))
        >>> List.filter ((a UArray.!) >>> pred >>> (== (a UArray.! p)))
        $ [(py-1, px), (py+1, px), (py, px-1), (py, px+1)]

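-- Number of distinct upward trails from p to any 9 (part 2's rating).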
distinctTrails :: (Int, Int) -> UArray (Int, Int) Int -> Int
distinctTrails p a
        | a UArray.! p == 9 = 1
        | otherwise = flip reachableNeighbors a
                >>> List.map (flip distinctTrails a)
                >>> sum
                $ p

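-- Set of 9-cells reachable from p by +1 steps; duplicates collapse in the Set, giving part 1's score.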
reachableNines :: (Int, Int) -> UArray (Int, Int) Int -> Set (Int, Int)
reachableNines p a
        | a UArray.! p == 9 = Set.singleton p
        | otherwise = flip reachableNeighbors a
                >>> List.map (flip reachableNines a)
                >>> Set.unions
                $ p

findZeros = UArray.assocs
        >>> filter (snd >>> (== 0))
        >>> map fst

part1 a = findZeros
        >>> map (flip reachableNines a)
        >>> map Set.size
        >>> sum
        $ a
part2 a = findZeros
        >>> map (flip distinctTrails a)
        >>> sum
        $ a

main = getContents
        >>= print
        . (part1 &&& part2)
        . parse
[–] VegOwOtenks 2 points 3 weeks ago

Maths degree at least explains the choice of language

[–] VegOwOtenks 2 points 3 weeks ago (3 children)

Thank you for trying, oh well. Maybe we are simply at the limits.

[–] VegOwOtenks 2 points 3 weeks ago (1 children)

Trees are a poor man's Sets and vice versa .-.

[–] VegOwOtenks 2 points 3 weeks ago* (last edited 3 weeks ago)

I only found your edit after I had finished my previous comment. I think splitting into two lists may be good: one list of files and one of empty blocks. Then again, this may not work with your checksumming, so maybe not.
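Roughly the shape I have in mind, as a type sketch (all names made up for illustration):

type FileEntry  = (Int, Int, Int)  -- (file id, start position, length)
type EmptyBlock = (Int, Int)       -- (start position, length)
type Disk       = ([FileEntry], [EmptyBlock])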

[–] VegOwOtenks 2 points 3 weeks ago (7 children)

Thank you for the detailed explanation! It made me realize that our solutions are very similar. Instead of keeping a Dict[Int, List[Int]] where the value list is ordered, I have a Dict[Int, Tree[Int]], which allows for easy (and fast!) lookup due to the nature of trees. (Also, lists in Haskell are horrible to mutate.)

I also apply your technique of processing each file only once: instead of calculating the checksum afterwards over the entire list of file blocks, I calculate it on the fly whenever I process a file. Using some maths I managed to reduce the sum to a constant expression.
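For reference, the constant expression is just the arithmetic series written out: a file with id fid, start position fpos and length flen contributes

fid * (fpos + (fpos + 1) + ... + (fpos + flen - 1)) = fid * (flen * fpos + flen * (flen - 1) `div` 2)

which is exactly what the fileChecksum helper in my solution further down evaluates.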

[–] VegOwOtenks 2 points 3 weeks ago (2 children)

It will always be a wonder to me how you manage to do so much in so few lines; even your naive solution only takes a few seconds to run. 🤯

[–] VegOwOtenks 2 points 3 weeks ago (11 children)

So cool! I was very hyped when I managed to squeeze out the last bit of performance; I hope you are too. I'm especially surprised you managed it with Python, even without simple tricks like trees ;)

I wanted to try it myself and can confirm it runs in under 0.1s in performance mode on my laptop. I am amazed, though I must admit I don't understand your newest revision. 🙈

[–] VegOwOtenks 4 points 3 weeks ago* (last edited 3 weeks ago)

Haskell

This was fun. I optimized quite a bit; as a result it now runs in 0.04s for both parts together on my 2016 laptop.

In part 1 I just run through the array with a start index and an end index while accumulating the checksum the whole time.
In part 2 I build up binary trees of free space, which let me efficiently search for and insert free spaces while traversing the disk from the back. Marking the moved files as free is omitted, because the checksum is calculated directly for every file, whether it is moved or not.

Code

import Control.Monad
import Data.Bifunctor

import Control.Arrow hiding (first, second)

import Data.Map (Map)
import Data.Set (Set)
import Data.Array.Unboxed (UArray)

import qualified Data.Map as Map
import qualified Data.Set as Set
import qualified Data.Ord as Ord
import qualified Data.List as List
import qualified Data.Char as Char
import qualified Data.Maybe as Maybe
import qualified Data.Array.Unboxed as UArray

toNumber = flip (-) (Char.ord '0') <<< Char.ord 

type FileID = Int
type FileLength = Int
type DiskPosition = Int
type File = (FileID, (DiskPosition, FileLength))
type EmptyMap = Map FileLength (Set DiskPosition)

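-- Annotate each (isFile, length) run with its absolute starting position on the disk.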
readDisk :: DiskPosition -> [(Bool, FileLength)] -> [(Bool, (DiskPosition, FileLength))]
readDisk _ [] = []
readDisk o ((True, l):fs)  = (True, (o, l))  : readDisk (o+l) fs
readDisk o ((False, l):fs) = (False, (o, l)) : readDisk (o+l) fs

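-- Split the dense disk map into the list of files (reversed, so highest id first)
-- and a map from free-span length to the set of span start positions.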
parse2 :: String -> ([File], EmptyMap)
parse2 s = takeWhile (/= '\n')
        >>> map toNumber
        >>> zip (cycle [True, False]) -- True is File, False is empty
        >>> readDisk 0
        >>> List.partition fst
        >>> join bimap (map snd)
        >>> first (zip [0..])
        >>> first List.reverse
        >>> second (filter (snd >>> (/= 0)))
        >>> second (List.sortOn snd)
        >>> second (List.groupBy (curry $ (snd *** snd) >>> uncurry (==)))
        >>> second (List.map (snd . head &&& map fst))
        >>> second (List.map (second Set.fromDistinctAscList))
        >>> second Map.fromDistinctAscList
        $ s

maybeMinimumBy :: (a -> a -> Ordering) -> [a] -> Maybe a
maybeMinimumBy _ [] = Nothing
maybeMinimumBy f as = Just $ List.minimumBy f as

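-- Closed form of fid * sum of the block positions fpos .. fpos+flen-1 occupied by the file.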
fileChecksum fid fpos flen = fid * (fpos * flen + ((flen-1) * (flen-1) + (flen-1)) `div` 2)

type Checksum = Int
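-- Move one file into the leftmost free span that is large enough and left of it, if any,
-- and add the file's checksum at its final position.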
moveFilesAccumulate :: (Checksum, EmptyMap) -> File -> (Checksum, EmptyMap)
moveFilesAccumulate (check, spaces) (fid, (fpos, flen)) = do
        let bestFit = Map.map (Set.minView)
                >>> Map.toList
                >>> List.filter (fst >>> (>= flen))
                >>> List.filter (snd >>> Maybe.isJust)
                >>> List.map (second Maybe.fromJust) -- [(FileLength, (DiskPosition, Set DiskPosition))]
                >>> List.filter (snd >>> fst >>> (< fpos))
                >>> maybeMinimumBy (\ (_, (p, _)) (_, (p', _)) -> Ord.compare p p')
                $ spaces

        case bestFit of
                Nothing -> (check + fileChecksum fid fpos flen, spaces)
                Just (spaceLength, (spacePosition, remainingSet)) -> do
                        -- remove the old empty entry by replacing the set
                        let updatedMap  = Map.update (const $! Just remainingSet) spaceLength spaces

                        -- add the remaining space, if any
                        let remainingSpace = spaceLength - flen
                        let remainingSpacePosition = spacePosition + flen
                        let updatedMap' = if remainingSpace == 0 then updatedMap else Map.insertWith (Set.union) remainingSpace (Set.singleton remainingSpacePosition) updatedMap

                        (check + fileChecksum fid spacePosition flen, updatedMap')

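-- Part 1 input: one array cell per block, holding the file id or -1 for free space.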
parse1 :: String -> UArray Int Int
parse1 s = UArray.listArray (0, sum lengthsOnly - 1) blocks
        where
                lengthsOnly = filter (/= '\n')
                        >>> map toNumber
                        $ s :: [Int]
                blocks = zip [0..]
                        >>> List.concatMap (\ (index, n) -> if index `mod` 2 == 0 then replicate n (index `div` 2) else replicate n (-1))
                        $ lengthsOnly :: [Int]

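-- Two-pointer scan: free blocks at 'start' are filled from the block at 'stop',
-- and the checksum is accumulated on the fly.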
moveBlocksAccumulate :: Int -> Int -> UArray Int Int -> Int
moveBlocksAccumulate start stop array
        | start      == stop   = if startBlock == -1 then 0 else start * startBlock
        | start      >  stop   = 0
        | stopBlock  == -1     = moveBlocksAccumulate start (stop - 1) array
        | startBlock == -1     = movedChecksum + moveBlocksAccumulate (start + 1) (stop - 1) array
        | startBlock /= -1     = startChecksum + moveBlocksAccumulate (start + 1) stop array
        where
                startBlock    = array UArray.! start
                stopBlock     = array UArray.! stop
                movedChecksum = stopBlock * start
                startChecksum = startBlock * start

part1 a = moveBlocksAccumulate 0 arrayLength a
        where
                (_, arrayLength) = UArray.bounds a
part2 (files, spaces) = foldl moveFilesAccumulate (0, spaces)
        >>> fst
        $ files

main = getContents
        >>= print
        . (part1 . parse1 &&& part2 . parse2)
