Co-authored-by: Michael Peyton Jones <me@michaelpj.com>
Ruslan Gadeev 2022-11-09 20:42:12 +03:00 committed by GitHub
parent 7c0201b509
commit 5d56aa70a8
50 changed files with 126 additions and 126 deletions

View File

@@ -206,5 +206,5 @@ This returns an error in HLS if `tasty-discover` is not in the path: `could not
 Due to some limitations in the interaction between HLS and `stack`, there are [issues](https://github.com/haskell/haskell-language-server/issues/366) in projects with multiple components (i.e. a main library and executables, test suites or benchmarks):
-- The project has to be built succesfully *before* loading it with HLS to get components other than the library work.
+- The project has to be built successfully *before* loading it with HLS to get components other than the library work.
 - Changes in the library are not automatically propagated to other components, especially in the presence of errors in the library. So you have to restart HLS in order for those components to be loaded correctly. The usual symptom is the editor showing errors like `Could not load module ...` or `Cannot satisfy -package ...`.

View File

@@ -20,7 +20,7 @@
 number of iterations. There is ample room for improvement:
 - Statistical analysis to detect outliers and auto infer the number of iterations needed
 - GC stats analysis (currently -S is printed as part of the experiment)
-- Analyisis of performance over the commit history of the project
+- Analysis of performance over the commit history of the project
 How to run:
 1. `cabal exec cabal run ghcide-bench -- -- ghcide-bench-options`

View File

@@ -3,8 +3,8 @@
 * Progress reporting improvements (#1784) - Pepe Iborra
 * Unify session loading using implicit-hie (#1783) - fendor
 * Fix remove constraint (#1578) - Kostas Dermentzis
-* Fix wrong extend import while type constuctor and data constructor have the same name (#1775) - Lei Zhu
-* Imporve vscode extension schema generation (#1742) - Potato Hatsue
+* Fix wrong extend import while type constructor and data constructor have the same name (#1775) - Lei Zhu
+* Improve vscode extension schema generation (#1742) - Potato Hatsue
 * Add hls-graph abstracting over shake (#1748) - Neil Mitchell
 * Tease apart the custom SYB from ExactPrint (#1746) - Sandy Maguire
 * fix class method completion (#1741) - Lei Zhu

View File

@@ -362,7 +362,7 @@ runWithDb recorder fp k = do
 withHieDb fp $ \writedb -> do
 -- the type signature is necessary to avoid concretizing the tyvar
--- e.g. `withWriteDbRetrable initConn` without type signature will
+-- e.g. `withWriteDbRetryable initConn` without type signature will
 -- instantiate tyvar `a` to `()`
 let withWriteDbRetryable :: WithHieDb
 withWriteDbRetryable = makeWithHieDbRetryable recorder rng writedb

View File

@@ -187,10 +187,10 @@ typecheckModule (IdeDefer defer) hsc tc_helpers pm = do
 tcRnModule session tc_helpers $ demoteIfDefer pm{pm_mod_summary = mod_summary''}
 let errorPipeline = unDefer . hideDiag dflags . tagDiag
 diags = map errorPipeline warnings
-deferedError = any fst diags
+deferredError = any fst diags
 case etcm of
 Left errs -> return (map snd diags ++ errs, Nothing)
-Right tcm -> return (map snd diags, Just $ tcm{tmrDeferedError = deferedError})
+Right tcm -> return (map snd diags, Just $ tcm{tmrDeferredError = deferredError})
 where
 demoteIfDefer = if defer then demoteTypeErrorsToWarnings else id
@@ -494,7 +494,7 @@ mkHiFileResultCompile se session' tcm simplified_guts = catchErrs $ do
 writeBinCoreFile fp core_file
 -- We want to drop references to guts and read in a serialized, compact version
 -- of the core file from disk (as it is deserialised lazily)
--- This is because we don't want to keep the guts in memeory for every file in
+-- This is because we don't want to keep the guts in memory for every file in
 -- the project as it becomes prohibitively expensive
 -- The serialized file however is much more compact and only requires a few
 -- hundred megabytes of memory total even in a large project with 1000s of
@@ -503,7 +503,7 @@ mkHiFileResultCompile se session' tcm simplified_guts = catchErrs $ do
 pure $ assert (core_hash1 == core_hash2)
 $ Just (core_file, fingerprintToBS core_hash2)
--- Verify core file by rountrip testing and comparison
+-- Verify core file by roundtrip testing and comparison
 IdeOptions{optVerifyCoreFile} <- getIdeOptionsIO se
 case core_file of
 Just (core, _) | optVerifyCoreFile -> do
@@ -773,7 +773,7 @@ generateHieAsts hscEnv tcm =
 -- These varBinds use unitDataConId but it could be anything as the id name is not used
 -- during the hie file generation process. It's a workaround for the fact that the hie modules
 -- don't export an interface which allows for additional information to be added to hie files.
-let fake_splice_binds = Util.listToBag (map (mkVarBind unitDataConId) (spliceExpresions $ tmrTopLevelSplices tcm))
+let fake_splice_binds = Util.listToBag (map (mkVarBind unitDataConId) (spliceExpressions $ tmrTopLevelSplices tcm))
 real_binds = tcg_binds $ tmrTypechecked tcm
 #if MIN_VERSION_ghc(9,0,1)
 ts = tmrTypechecked tcm :: TcGblEnv
@@ -801,8 +801,8 @@ generateHieAsts hscEnv tcm =
 #endif
 #endif
-spliceExpresions :: Splices -> [LHsExpr GhcTc]
-spliceExpresions Splices{..} =
+spliceExpressions :: Splices -> [LHsExpr GhcTc]
+spliceExpressions Splices{..} =
 DL.toList $ mconcat
 [ DL.fromList $ map fst exprSplices
 , DL.fromList $ map fst patSplices
@@ -812,7 +812,7 @@ spliceExpresions Splices{..} =
 ]
 -- | In addition to indexing the `.hie` file, this function is responsible for
--- maintaining the 'IndexQueue' state and notfiying the user about indexing
+-- maintaining the 'IndexQueue' state and notifying the user about indexing
 -- progress.
 --
 -- We maintain a record of all pending index operations in the 'indexPending'
@@ -1409,7 +1409,7 @@ instance NFData IdeLinkable where
 ml_core_file :: ModLocation -> FilePath
 ml_core_file ml = ml_hi_file ml <.> "core"
--- | Retuns an up-to-date module interface, regenerating if needed.
+-- | Returns an up-to-date module interface, regenerating if needed.
 -- Assumes file exists.
 -- Requires the 'HscEnv' to be set up with dependencies
 -- See Note [Recompilation avoidance in the presence of TH]
@@ -1437,7 +1437,7 @@ loadInterface session ms linkableNeeded RecompilationInfo{..} = do
 -- The source is modified if it is newer than the destination (iface file)
 -- A more precise check for the core file is performed later
 let sourceMod = case mb_dest_version of
-Nothing -> SourceModified -- desitination file doesn't exist, assume modified source
+Nothing -> SourceModified -- destination file doesn't exist, assume modified source
 Just dest_version
 | source_version <= dest_version -> SourceUnmodified
 | otherwise -> SourceModified
@@ -1466,7 +1466,7 @@ loadInterface session ms linkableNeeded RecompilationInfo{..} = do
 Just (old_hir, _)
 | isNothing linkableNeeded || isJust (hirCoreFp old_hir)
 -> do
--- Peform the fine grained recompilation check for TH
+-- Perform the fine grained recompilation check for TH
 maybe_recomp <- checkLinkableDependencies get_linkable_hashes (hsc_mod_graph sessionWithMsDynFlags) (hirRuntimeModules old_hir)
 case maybe_recomp of
 Just msg -> do_regenerate msg
@@ -1478,7 +1478,7 @@ loadInterface session ms linkableNeeded RecompilationInfo{..} = do
 let runtime_deps
 | not (mi_used_th iface) = emptyModuleEnv
 | otherwise = parseRuntimeDeps (md_anns details)
--- Peform the fine grained recompilation check for TH
+-- Perform the fine grained recompilation check for TH
 maybe_recomp <- checkLinkableDependencies get_linkable_hashes (hsc_mod_graph sessionWithMsDynFlags) runtime_deps
 case maybe_recomp of
 Just msg -> do_regenerate msg
@@ -1598,7 +1598,7 @@ coreFileToLinkable linkableType session ms iface details core_file t = do
 --- and leads to fun errors like "Cannot continue after interface file error".
 getDocsBatch
 :: HscEnv
--> Module -- ^ a moudle where the names are in scope
+-> Module -- ^ a module where the names are in scope
 -> [Name]
 #if MIN_VERSION_ghc(9,3,0)
 -> IO [Either String (Maybe [HsDoc GhcRn], IntMap (HsDoc GhcRn))]

View File

@@ -155,7 +155,7 @@ data TcModuleResult = TcModuleResult
 , tmrTypechecked :: TcGblEnv
 , tmrTopLevelSplices :: Splices
 -- ^ Typechecked splice information
-, tmrDeferedError :: !Bool
+, tmrDeferredError :: !Bool
 -- ^ Did we defer any type errors for this module?
 , tmrRuntimeModules :: !(ModuleEnv ByteString)
 -- ^ Which modules did we need at runtime while compiling this file?

View File

@@ -782,7 +782,7 @@ ghcSessionDepsDefinition fullModSummary GhcSessionDepsConfig{..} env file = do
 let inLoadOrder = map (\HiFileResult{..} -> HomeModInfo hirModIface hirModDetails Nothing) ifaces
 #if MIN_VERSION_ghc(9,3,0)
 -- On GHC 9.4+, the module graph contains not only ModSummary's but each `ModuleNode` in the graph
--- also points to all the direct descendents of the current module. To get the keys for the descendents
+-- also points to all the direct descendants of the current module. To get the keys for the descendants
 -- we must get their `ModSummary`s
 !final_deps <- do
 dep_mss <- map msrModSummary <$> uses_ GetModSummaryWithoutTimestamps deps
@@ -950,7 +950,7 @@ getModIfaceRule recorder = defineEarlyCutoff (cmapWithPrio LogShake recorder) $
 hiDiags <- case hiFile of
 Just hiFile
 | OnDisk <- status
-, not (tmrDeferedError tmr) -> liftIO $ writeHiFile se hsc hiFile
+, not (tmrDeferredError tmr) -> liftIO $ writeHiFile se hsc hiFile
 _ -> pure []
 return (fp, (diags++hiDiags, hiFile))
 NotFOI -> do
@@ -1022,9 +1022,9 @@ regenerateHiFile sess f ms compNeeded = do
 wDiags <- forM masts $ \asts ->
 liftIO $ writeAndIndexHieFile hsc se (tmrModSummary tmr) f (tcg_exports $ tmrTypechecked tmr) asts source
--- We don't write the `.hi` file if there are defered errors, since we won't get
+-- We don't write the `.hi` file if there are deferred errors, since we won't get
 -- accurate diagnostics next time if we do
-hiDiags <- if not $ tmrDeferedError tmr
+hiDiags <- if not $ tmrDeferredError tmr
 then liftIO $ writeHiFile se hsc hiFile
 else pure []
@@ -1057,7 +1057,7 @@ getClientSettingsRule recorder = defineEarlyCutOffNoFile (cmapWithPrio LogShake
 settings <- clientSettings <$> getIdeConfiguration
 return (LBS.toStrict $ B.encode $ hash settings, settings)
--- | Returns the client configurarion stored in the IdeState.
+-- | Returns the client configuration stored in the IdeState.
 -- You can use this function to access it from shake Rules
 getClientConfigAction :: Config -- ^ default value
 -> Action Config

View File

@@ -256,7 +256,7 @@ data ShakeExtras = ShakeExtras
 -- ^ Map from a text document version to a PositionMapping that describes how to map
 -- positions in a version of that document to positions in the latest version
 -- First mapping is delta from previous version and second one is an
--- accumlation of all previous mappings.
+-- accumulation of all previous mappings.
 ,progress :: ProgressReporting
 ,ideTesting :: IdeTesting
 -- ^ Whether to enable additional lsp messages used by the test suite for checking invariants
@@ -280,12 +280,12 @@ data ShakeExtras = ShakeExtras
 , withHieDb :: WithHieDb -- ^ Use only to read.
 , hiedbWriter :: HieDbWriter -- ^ use to write
 , persistentKeys :: TVar (KeyMap GetStalePersistent)
--- ^ Registery for functions that compute/get "stale" results for the rule
+-- ^ Registry for functions that compute/get "stale" results for the rule
 -- (possibly from disk)
 , vfsVar :: TVar VFS
 -- ^ A snapshot of the current state of the virtual file system. Updated on shakeRestart
 -- VFS state is managed by LSP. However, the state according to the lsp library may be newer than the state of the current session,
--- leaving us vulnerable to suble race conditions. To avoid this, we take a snapshot of the state of the VFS on every
+-- leaving us vulnerable to subtle race conditions. To avoid this, we take a snapshot of the state of the VFS on every
 -- restart, so that the whole session sees a single consistent view of the VFS.
 -- We don't need a STM.Map because we never update individual keys ourselves.
 , defaultConfig :: Config
@@ -662,7 +662,7 @@ getStateKeys = (fmap.fmap) fst . atomically . ListT.toList . STM.listT . state
 -- | Must be called in the 'Initialized' handler and only once
 shakeSessionInit :: Recorder (WithPriority Log) -> IdeState -> IO ()
 shakeSessionInit recorder ide@IdeState{..} = do
--- Take a snapshot of the VFS - it should be empty as we've recieved no notifications
+-- Take a snapshot of the VFS - it should be empty as we've received no notifications
 -- till now, but it can't hurt to be in sync with the `lsp` library.
 vfs <- vfsSnapshot (lspEnv shakeExtras)
 initSession <- newSession recorder shakeExtras (VFSModified vfs) shakeDb [] "shakeSessionInit"
@@ -831,7 +831,7 @@ instantiateDelayedAction (DelayedAction _ s p a) = do
 b <- newBarrier
 let a' = do
 -- work gets reenqueued when the Shake session is restarted
--- it can happen that a work item finished just as it was reenqueud
+-- it can happen that a work item finished just as it was reenqueued
 -- in that case, skipping the work is fine
 alreadyDone <- liftIO $ isJust <$> waitBarrierMaybe b
 unless alreadyDone $ do

View File

@@ -3,7 +3,7 @@
 -- Copied from https://github.com/ghc/ghc/blob/master/compiler/main/DriverPipeline.hs on 14 May 2019
 -- Requested to be exposed at https://gitlab.haskell.org/ghc/ghc/merge_requests/944.
--- Update the above MR got merged to master on 31 May 2019. When it becomes avialable to ghc-lib, this file can be removed.
+-- Update the above MR got merged to master on 31 May 2019. When it becomes available to ghc-lib, this file can be removed.
 {- HLINT ignore -} -- since copied from upstream

View File

@@ -2,7 +2,7 @@
 {-# LANGUAGE PatternSynonyms #-}
 {-# HLINT ignore "Unused LANGUAGE pragma" #-}
--- | Parser compaibility module.
+-- | Parser compatibility module.
 module Development.IDE.GHC.Compat.Parser (
 initParserOpts,
 initParserState,

View File

@@ -183,7 +183,7 @@ notFoundErr env modName reason =
 mkError' = diagFromString "not found" DsError (Compat.getLoc modName)
 modName0 = unLoc modName
 ppr' = showSDoc dfs
--- We convert the lookup result to a find result to reuse GHC's cannotFindMoudle pretty printer.
+-- We convert the lookup result to a find result to reuse GHC's cannotFindModule pretty printer.
 lookupToFindResult =
 \case
 LookupFound _m _pkgConfig ->

View File

@@ -59,8 +59,8 @@ heapStatsThread l = forever $ do
 threadDelay heapStatsInterval
 logHeapStats l
--- | A helper function which lauches the 'heapStatsThread' and kills it
--- appropiately when the inner action finishes. It also checks to see
+-- | A helper function which launches the 'heapStatsThread' and kills it
+-- appropriately when the inner action finishes. It also checks to see
 -- if `-T` is enabled.
 withHeapStats :: Recorder (WithPriority Log) -> IO r -> IO r
 withHeapStats l k = do

View File

@@ -80,7 +80,7 @@ produceCompletions recorder = do
 _ -> return ([], Nothing)
 define (cmapWithPrio LogShake recorder) $ \NonLocalCompletions file -> do
 -- For non local completions we avoid depending on the parsed module,
--- synthetizing a fake module with an empty body from the buffer
+-- synthesizing a fake module with an empty body from the buffer
 -- in the ModSummary, which preserves all the imports
 ms <- fmap fst <$> useWithStale GetModSummaryWithoutTimestamps file
 sess <- fmap fst <$> useWithStale GhcSessionDeps file

View File

@@ -366,11 +366,11 @@ cacheDataProducer uri env curMod globalEnv inScopeEnv limports = do
 asNamespace :: ImportDecl GhcPs -> ModuleName
 asNamespace imp = maybe (iDeclToModName imp) GHC.unLoc (ideclAs imp)
 -- Full canonical names of imported modules
-importDeclerations = map unLoc limports
+importDeclarations = map unLoc limports
 -- The given namespaces for the imported modules (ie. full name, or alias if used)
-allModNamesAsNS = map (showModName . asNamespace) importDeclerations
+allModNamesAsNS = map (showModName . asNamespace) importDeclarations
 rdrElts = globalRdrEnvElts globalEnv

View File

@@ -128,7 +128,7 @@ getDocumentation
 -- TODO : Build a version of GHC exactprint to extract this information
 -- more accurately.
 -- TODO : Implement this for GHC 9.2 with in-tree annotations
--- (alternatively, just remove it and rely soley on GHC's parsing)
+-- (alternatively, just remove it and rely solely on GHC's parsing)
 getDocumentation sources targetName = fromMaybe [] $ do
 #if MIN_VERSION_ghc(9,2,0)
 Nothing
@@ -137,7 +137,7 @@ getDocumentation sources targetName = fromMaybe [] $ do
 targetNameSpan <- realSpan $ getLoc targetName
 tc <-
 find ((==) (Just $ srcSpanFile targetNameSpan) . annotationFileName)
-$ reverse sources -- TODO : Is reversing the list here really neccessary?
+$ reverse sources -- TODO : Is reversing the list here really necessary?
 -- Top level names bound by the module
 let bs = [ n | let L _ HsModule{hsmodDecls} = pm_parsed_source tc

View File

@@ -593,7 +593,7 @@ diagnosticTests = testGroup "diagnostics"
 [ ( "Foo.hs"
 -- The test is to make sure that warnings contain unqualified names
 -- where appropriate. The warning should use an unqualified name 'Ord', not
--- sometihng like 'GHC.Classes.Ord'. The choice of redundant-constraints to
+-- something like 'GHC.Classes.Ord'. The choice of redundant-constraints to
 -- test this is fairly arbitrary.
 , [(DsWarning, (2, if ghcVersion >= GHC94 then 7 else 0), "Redundant constraint: Ord a")
 ]
@@ -2593,7 +2593,7 @@ simpleMultiTest3 =
 checkDefs locs (pure [fooL])
 expectNoMoreDiagnostics 0.5
--- Like simpleMultiTest but open the files in component 'a' in a seperate session
+-- Like simpleMultiTest but open the files in component 'a' in a separate session
 simpleMultiDefTest :: TestTree
 simpleMultiDefTest = testCase "simple-multi-def-test" $ runWithExtraFiles "multi" $ \dir -> do
 let aPath = dir </> "a/A.hs"
@@ -2670,7 +2670,7 @@ ifaceTHTest = testCase "iface-th-test" $ runWithExtraFiles "TH" $ \dir -> do
 -- Change [TH]a from () to Bool
 liftIO $ writeFileUTF8 aPath (unlines $ init (lines $ T.unpack aSource) ++ ["th_a = [d| a = False|]"])
--- Check that the change propogates to C
+-- Check that the change propagates to C
 changeDoc cdoc [TextDocumentContentChangeEvent Nothing Nothing cSource]
 expectDiagnostics
 [("THC.hs", [(DsError, (4, 4), "Couldn't match expected type '()' with actual type 'Bool'")])
@@ -2694,11 +2694,11 @@ ifaceErrorTest = testCase "iface-error-test-1" $ runWithExtraFiles "recomp" $ \d
 -- Change y from Int to B
 changeDoc bdoc [TextDocumentContentChangeEvent Nothing Nothing $ T.unlines ["module B where", "y :: Bool", "y = undefined"]]
--- save so that we can that the error propogates to A
+-- save so that we can that the error propagates to A
 sendNotification STextDocumentDidSave (DidSaveTextDocumentParams bdoc Nothing)
--- Check that the error propogates to A
+-- Check that the error propagates to A
 expectDiagnostics
 [("A.hs", [(DsError, (5, 4), "Couldn't match expected type 'Int' with actual type 'Bool'")])]

View File

@@ -981,7 +981,7 @@ function speculativeCriticalPath(profile) {
 return maxCriticalPath;
 }
 /*
-Calculating a precise critical path, taking into account the deep dependeny structure, is non-obvious.
+Calculating a precise critical path, taking into account the deep dependency structure, is non-obvious.
 Dependencies have the type [{X}], e.g:
 X = [{a,b},{c,d}]

View File

@@ -58,7 +58,7 @@ data TrackSTMConf = TrackSTMConf
 { tryThreshold :: Maybe Int
 -- ^ If the number of retries of one transaction run reaches this
 -- count, a warning is issued at runtime. If set to @Nothing@, disables the warnings completely.
-, globalTheshold :: Maybe Int
+, globalThreshold :: Maybe Int
 -- ^ If the total number of retries of one named transaction reaches
 -- this count, a warning is issued. If set to @Nothing@, disables the
 -- warnings completely.
@@ -79,7 +79,7 @@ data TrackSTMConf = TrackSTMConf
 --
 -- > defaultTrackSTMConf = TrackSTMConf
 -- > { tryThreshold = Just 10
--- > , globalTheshold = Just 3000
+-- > , globalThreshold = Just 3000
 -- > , exception = True
 -- > , warnFunction = hPutStrLn stderr
 -- > , warnInSTMFunction = \_ -> return ()
@@ -87,7 +87,7 @@ data TrackSTMConf = TrackSTMConf
 defaultTrackSTMConf :: TrackSTMConf
 defaultTrackSTMConf = TrackSTMConf
 { tryThreshold = Just 10
-, globalTheshold = Just 3000
+, globalThreshold = Just 3000
 , extendException = True
 , warnFunction = hPutStrLn stderr
 , warnInSTMFunction = \_ -> return ()
@@ -143,7 +143,7 @@ trackSTMConf (TrackSTMConf {..}) name txm = do
 (1,i)
 m
 in (m', let j = maybe 0 snd oldVal in (j,j+i))
-doMB globalTheshold $ \globalRetryThreshold ->
+doMB globalThreshold $ \globalRetryThreshold ->
 when (k `div` globalRetryThreshold /= k' `div` globalRetryThreshold) $
 warnFunction $ msgPrefix ++ " reached global retry count of " ++ show k'

View File

@@ -173,7 +173,7 @@ idePluginsToPluginDesc (IdePlugins pp) = pp
 -- ---------------------------------------------------------------------
 -- | Returns the current client configuration. It is not wise to permanently
--- cache the returned value of this function, as clients can at runitime change
+-- cache the returned value of this function, as clients can at runtime change
 -- their configuration.
 --
 getClientConfig :: MonadLsp Config m => m Config

View File

@@ -62,13 +62,13 @@ addMethodPlaceholders _ state param@AddMinimalMethodsParams{..} = do
 pure Null
 where
-toTextDocunemtEdit edit =
+toTextDocumentEdit edit =
 TextDocumentEdit (VersionedTextDocumentIdentifier uri (Just 0)) (List [InL edit])
 mergeEdit :: WorkspaceEdit -> [TextEdit] -> WorkspaceEdit
 mergeEdit WorkspaceEdit{..} edits = WorkspaceEdit
 { _documentChanges =
-(\(List x) -> List $ x ++ map (InL . toTextDocunemtEdit) edits)
+(\(List x) -> List $ x ++ map (InL . toTextDocumentEdit) edits)
 <$> _documentChanges
 , ..
 }

View File

@@ -68,7 +68,7 @@ data Log
 instance Pretty Log where
 pretty = \case
 LogImplementedMethods cls methods ->
-pretty ("Detected implmented methods for class" :: String)
+pretty ("Detected implemented methods for class" :: String)
 <+> pretty (show (getOccString cls) <> ":") -- 'show' is used here to add quotes around the class name
 <+> pretty methods
 LogShake log -> pretty log

View File

@@ -144,7 +144,7 @@ getSelectionRanges :: NormalizedFilePath -> [Position] -> ExceptT SelectionRange
 getSelectionRanges file positions = do
 (codeRange, positionMapping) <- maybeToExceptT (SelectionRangeBadDependency GetCodeRange) . MaybeT $
 useWithStaleFast GetCodeRange file
--- 'positionMapping' should be appied to the input before using them
+-- 'positionMapping' should be applied to the input before using them
 positions' <- maybeToExceptT SelectionRangeInputPositionMappingFailure . MaybeT . pure $
 traverse (fromCurrentPosition positionMapping) positions

View File

@@ -168,7 +168,7 @@ identifierForTypeSig node =
 (fmap fst . find (\(_, detail) -> TyDecl `Set.member` identInfo detail)
 . Map.toList . nodeIdentifiers)
--- | Determines if the given occurence of an identifier is a function/variable definition in the outer span
+-- | Determines if the given occurrence of an identifier is a function/variable definition in the outer span
 isIdentADef :: Span -> (Span, IdentifierDetails a) -> Bool
 isIdentADef outerSpan (span, detail) =
 realSrcSpanStart span >= realSrcSpanStart outerSpan && realSrcSpanEnd span <= realSrcSpanEnd outerSpan

View File

@@ -79,13 +79,13 @@ h98ToGADTDecl = \case
 }
 x -> x
--- | Convert H98 data constuctor to GADT data constructor
+-- | Convert H98 data constructor to GADT data constructor
 h98ToGADTConDecl ::
-LIdP GP -- ^Type constuctor name,
--- used for constucting final result type in GADT
+LIdP GP -- ^Type constructor name,
+-- used for constructing final result type in GADT
 -> LHsQTyVars GP
 -- ^Type variable names
--- used for constucting final result type in GADT
+-- used for constructing final result type in GADT
 -> Maybe (LHsContext GP)
 -- ^Data type context
 -> ConDecl GP
@@ -203,7 +203,7 @@ prettyGADTDecl df decl =
 where
 go (Anchor a _) = Anchor a (MovedAnchor (DifferentLine 1 2))
--- Adjust where annotation to the same line of the type constuctor
+-- Adjust where annotation to the same line of the type constructor
 adjustWhere tcdDExt = tcdDExt <&> map
 (\(AddEpAnn ann l) ->
 if ann == AnnWhere
@@ -237,7 +237,7 @@ prettyGADTDecl df decl =
 | isConDeclGADTAnn key = adjustCon ann
 | otherwise = ann
--- Adjust where annotation to the same line of the type constuctor
+-- Adjust where annotation to the same line of the type constructor
 adjustWhere Ann{..} = Ann
 { annsDP = annsDP <&>
 (\(keyword, dp) ->
@@ -249,7 +249,7 @@ prettyGADTDecl df decl =
 -- Make every data constructor start with a new line and 2 spaces
 --
--- Here we can't force every GADT constuctor has (1, 2)
+-- Here we can't force every GADT constructor has (1, 2)
 -- delta. For the first constructor with (1, 2), it prints
 -- a new line with 2 spaces, but for other constructors
 -- with (1, 2), it will print a new line with 4 spaces.

View File

@@ -32,7 +32,7 @@ tests = testGroup "GADT"
 , runTest "DataContext" "DataContext" 2 0 2 31
 , runTest "DataContextParen" "DataContextParen" 2 0 3 6
 , runTest "Forall" "Forall" 2 0 2 44
-, runTest "ConstuctorContext" "ConstructorContext" 2 0 2 38
+, runTest "ConstructorContext" "ConstructorContext" 2 0 2 38
 , runTest "Context" "Context" 2 0 4 41
 , runTest "Pragma" "Pragma" 2 0 3 29
 , onlyWorkForGhcVersions (==GHC92) "Single deriving has different output on ghc9.2" $

View File

@@ -47,10 +47,10 @@ addHaddockCommentsToList
 :: (Data a, Monad m)
 => Bool -- ^ If true, for each node, use previous node in the list as the anchor. Otherwise, use the outer node
 -> SrcSpan -- ^ The outer node
--> KeywordId -- ^ The seperator between adjacent nodes
+-> KeywordId -- ^ The separator between adjacent nodes
 -> [Located a] -- ^ The list of nodes. Haddock comments will be added to each of them
 -> TransformT m ()
-addHaddockCommentsToList usePrevNodeAsAnchor outerLoc seperator nodes =
+addHaddockCommentsToList usePrevNodeAsAnchor outerLoc separator nodes =
 -- If you want to understand this function, please first read this page carefully:
 -- https://hackage.haskell.org/package/ghc-exactprint-0.6.4/docs/Language-Haskell-GHC-ExactPrint-Delta.html
 -- The important part is that for DP(r,c), if r is zero, c is the offset start from the end of the previous node.
@@ -71,23 +71,23 @@ addHaddockCommentsToList usePrevNodeAsAnchor outerLoc seperator nodes =
 -- For the multiline case (which is the most common), we keep the original indentation of each constructor
 -- and field.
 --
--- For the inline case, we use the first construcotr/field as the base, and align all following items
+-- For the inline case, we use the first constructor/field as the base, and align all following items
 -- to them.
 let sameLineAsPrev = prevNode >>= (
-\prevNode' -> if notSeperatedByLineEnding prevNode' node
+\prevNode' -> if notSeparatedByLineEnding prevNode' node
 then pure prevNode'
 else Nothing
 )
--- For the inline case, we need to move the seperator to the next line.
+-- For the inline case, we need to move the separator to the next line.
 -- For constructors, it's vertical bar; for fields, it's comma.
--- The seperator is passed in as function argument.
+-- The separator is passed in as function argument.
 when (isJust sameLineAsPrev) $ modifyAnnsT $ \anns ->
 let newSepCol :: Annotation -> Int
 newSepCol ann =
 if usePrevNodeAsAnchor then 0 else deltaColumn (annEntryDelta ann)
 updateSepAnn :: Annotation -> Annotation
 updateSepAnn ann = ann {annsDP =
-Map.toList . Map.adjust (const $ DP (1, newSepCol ann)) seperator . Map.fromList $ annsDP ann}
+Map.toList . Map.adjust (const $ DP (1, newSepCol ann)) separator . Map.fromList $ annsDP ann}
 in flip (maybe anns) prevNode $ \prevNode' -> Map.adjust updateSepAnn (mkAnnKey prevNode') anns
 -- Calculate the real column of the anchor
 let anchorCol = maybe 0 srcSpanStartCol . realSpan . maybe outerLoc getLoc $
@@ -110,7 +110,7 @@ addHaddockCommentsToList usePrevNodeAsAnchor outerLoc seperator nodes =
 let updateCurrent :: Annotation -> Annotation
 updateCurrent ann = ann {
 -- If there exist non-haddock comments, we simply inherit the first one's delta pos,
--- and move them two lines below, to seperate them from our newly added haddock comments
+-- and move them two lines below, to separate them from our newly added haddock comments
 -- Otherwise, inherit the node's entry delta pos.
 annPriorComments = case annPriorComments ann of
 (c, dp) : rem -> (emptyPriorHaddockComment, dp) : (c, DP (2,0)) : rem
@@ -127,12 +127,12 @@ missingSomeHaddock anns = any $ \lcon@(L _ conDecl) -> case conDecl of
 _ -> False -- GADT is not supported yet
 -- | Returns 'True' if the end of the first node and the start of the second node are on the same line.
-notSeperatedByLineEnding :: Located a
+notSeparatedByLineEnding :: Located a
 -> Located a
 -> Bool
-notSeperatedByLineEnding (L (RealSrcSpan x _) _) (L (RealSrcSpan y _) _) =
+notSeparatedByLineEnding (L (RealSrcSpan x _) _) (L (RealSrcSpan y _) _) =
 srcLocLine (realSrcSpanEnd x) == srcLocLine (realSrcSpanStart y)
-notSeperatedByLineEnding _ _ = False
+notSeparatedByLineEnding _ _ = False
 -- | Empty haddock, suitable for being added to 'annPriorComments'
 emptyPriorHaddockComment :: Comment

View File

@@ -430,7 +430,7 @@ codeActionProvider ideState pluginId (CodeActionParams _ _ documentId _ context)
 LSP.List diags = context ^. LSP.diagnostics
--- | Convert a hlint diagonistic into an apply and an ignore code action
+-- | Convert a hlint diagnostic into an apply and an ignore code action
 -- if applicable
 diagnosticToCodeActions :: DynFlags -> T.Text -> PluginId -> TextDocumentIdentifier -> LSP.Diagnostic -> [LSP.CodeAction]
 diagnosticToCodeActions dynFlags fileContents pluginId documentId diagnostic
@@ -555,7 +555,7 @@ applyHint recorder ide nfp mhint =
 modsum <- liftIO $ runAction' $ use_ GetModSummary nfp
 let dflags = ms_hspp_opts $ msrModSummary modsum
 -- Setting a environment variable with the libdir used by ghc-exactprint.
--- It is a workaround for an error caused by the use of a hadcoded at compile time libdir
+-- It is a workaround for an error caused by the use of a hardcoded at compile time libdir
 -- in ghc-exactprint that makes dependent executables non portables.
 -- See https://github.com/alanz/ghc-exactprint/issues/96.
 -- WARNING: this code is not thread safe, so if you try to apply several async refactorings

View File

@@ -319,7 +319,7 @@ getNeedsSpaceAndParenthesize ::
 getNeedsSpaceAndParenthesize dst a =
 -- Traverse the tree, looking for our replacement node. But keep track of
 -- the context (parent HsExpr constructor) we're in while we do it. This
--- lets us determine wehther or not we need parentheses.
+-- lets us determine whether or not we need parentheses.
 let (needs_parens, needs_space) =
 everythingWithContext (Nothing, Nothing) (<>)
 ( mkQ (mempty, ) $ \x s -> case x of

View File

@@ -160,7 +160,7 @@ typeSigsPluginDescriptor recorder plId = mkExactprintPluginDescriptor recorder $
 mkGhcideCAsPlugin [
 wrap $ suggestSignature True
 , wrap suggestFillTypeWildcard
-, wrap suggestAddTypeAnnotationToSatisfyContraints
+, wrap suggestAddTypeAnnotationToSatisfyConstraints
 #if !MIN_VERSION_ghc(9,3,0)
 , wrap removeRedundantConstraints
 , wrap suggestConstraint
@@ -396,7 +396,7 @@ suggestHideShadow ps fileContents mTcM mHar Diagnostic {_message, _range}
 Just matched <- allMatchRegexUnifySpaces _message "imported from ([^]+) at ([^ ]*)",
 mods <- [(modName, s) | [_, modName, s] <- matched],
 result <- nubOrdBy (compare `on` fst) $ mods >>= uncurry (suggests identifier),
-hideAll <- ("Hide " <> identifier <> " from all occurence imports", concatMap snd result) =
+hideAll <- ("Hide " <> identifier <> " from all occurrence imports", concatMap snd result) =
 result <> [hideAll]
 | otherwise = []
 where
@@ -793,8 +793,8 @@ suggestExportUnusedTopBinding srcOpt ParsedModule{pm_parsed_source = L _ HsModul
 exportsAs (TyClD _ FamDecl{tcdFam}) = Just (ExportFamily, reLoc $ fdLName tcdFam)
 exportsAs _ = Nothing
-suggestAddTypeAnnotationToSatisfyContraints :: Maybe T.Text -> Diagnostic -> [(T.Text, [TextEdit])]
-suggestAddTypeAnnotationToSatisfyContraints sourceOpt Diagnostic{_range=_range,..}
+suggestAddTypeAnnotationToSatisfyConstraints :: Maybe T.Text -> Diagnostic -> [(T.Text, [TextEdit])]
+suggestAddTypeAnnotationToSatisfyConstraints sourceOpt Diagnostic{_range=_range,..}
 -- File.hs:52:41: warning:
 -- * Defaulting the following constraint to type Integer
 -- Num p0 arising from the literal 1
@@ -1734,7 +1734,7 @@ findPositionAfterModuleName ps hsmodName' = do
 prevSrcSpan = maybe (getLoc hsmodName') getLoc hsmodExports
 -- The relative position of 'where' keyword (in lines, relative to the previous AST node).
--- The exact-print API changed a lot in ghc-9.2, so we need to handle it seperately for different compiler versions.
+-- The exact-print API changed a lot in ghc-9.2, so we need to handle it separately for different compiler versions.
 whereKeywordLineOffset :: Maybe Int
 #if MIN_VERSION_ghc(9,2,0)
 whereKeywordLineOffset = case hsmodAnn of
@@ -1767,7 +1767,7 @@ findPositionAfterModuleName ps hsmodName' = do
 deltaPos <- fmap NE.head . NE.nonEmpty .mapMaybe filterWhere $ annsDP ann
 pure $ deltaRow deltaPos
--- Before ghc 9.2, DeltaPos doesn't take comment into acccount, so we don't need to sum line offset of comments.
+-- Before ghc 9.2, DeltaPos doesn't take comment into account, so we don't need to sum line offset of comments.
 filterWhere :: (KeywordId, DeltaPos) -> Maybe DeltaPos
 filterWhere (keywordId, deltaPos) =
 if keywordId == G AnnWhere then Just deltaPos else Nothing

View File

@@ -385,7 +385,7 @@ extendImportTopLevel thing (L l it@ImportDecl{..})
 nodeHasComma x = isJust $ Map.lookup (mkAnnKey x) anns >>= find isAnnComma . annsDP
 when shouldAddTrailingComma (addTrailingCommaT x)
--- Parens are attachted to `lies`, so if `lies` was empty previously,
+-- Parens are attached to `lies`, so if `lies` was empty previously,
 -- we need change the ann key from `[]` to `:` to keep parens and other anns.
 unless hasSibling $
 transferAnn (L l' lies) (L l' [x]) id
@@ -435,7 +435,7 @@ extendImportViaParent df parent child (L l it@ImportDecl{..})
 childLIE = reLocA $ L srcChild $ IEName childRdr
 #if !MIN_VERSION_ghc(9,2,0)
 x :: LIE GhcPs = L ll' $ IEThingWith noExtField absIE NoIEWildcard [childLIE] []
--- take anns from ThingAbs, and attatch parens to it
+-- take anns from ThingAbs, and attach parens to it
 transferAnn lAbs x $ \old -> old{annsDP = annsDP old ++ [(G AnnOpenP, DP (0, 1)), (G AnnCloseP, dp00)]}
 addSimpleAnnT childRdr dp00 [(G AnnVal, dp00)]
 #else
@@ -518,7 +518,7 @@ extendImportViaParent df parent child (L l it@ImportDecl{..})
 addSimpleAnnT parentRdr (DP (0, if hasSibling then 1 else 0)) $ unqalDP 1 isParentOperator
 addSimpleAnnT childRdr (DP (0, 0)) [(G AnnVal, dp00)]
 addSimpleAnnT x (DP (0, 0)) [(G AnnOpenP, DP (0, 1)), (G AnnCloseP, DP (0, 0))]
--- Parens are attachted to `pre`, so if `pre` was empty previously,
+-- Parens are attached to `pre`, so if `pre` was empty previously,
 -- we need change the ann key from `[]` to `:` to keep parens and other anns.
 unless hasSibling $
 transferAnn (L l' $ reverse pre) (L l' [x]) id
@@ -538,7 +538,7 @@ extendImportViaParent _ _ _ _ = lift $ Left "Unable to extend the import list vi
 addCommaInImportList ::
 -- | Initial list
 [LocatedAn AnnListItem a]
--- | Additionnal item
+-- | Additional item
 -> LocatedAn AnnListItem a
 -> [LocatedAn AnnListItem a]
 addCommaInImportList lies x =

View File

@@ -469,12 +469,12 @@ insertImportTests = testGroup "insert import"
 "NoExplicitExports.expected.hs"
 "import Data.Monoid"
 , checkImport
-"add to correctly placed exisiting import"
+"add to correctly placed existing import"
 "ImportAtTop.hs"
 "ImportAtTop.expected.hs"
 "import Data.Monoid"
 , checkImport
-"add to multiple correctly placed exisiting imports"
+"add to multiple correctly placed existing imports"
 "MultipleImportsAtTop.hs"
 "MultipleImportsAtTop.expected.hs"
 "import Data.Monoid"
@@ -1856,7 +1856,7 @@ suggestHideShadowTests =
 [ testGroup
 "single"
 [ testOneCodeAction
-"hide unsued"
+"hide unused"
 "Hide on from Data.Function"
 (1, 2)
 (1, 4)
@@ -1869,7 +1869,7 @@ suggestHideShadowTests =
 , "g on = on"
 ]
 , testOneCodeAction
-"extend hiding unsued"
+"extend hiding unused"
 "Hide on from Data.Function"
 (1, 2)
 (1, 4)
@@ -1880,7 +1880,7 @@ suggestHideShadowTests =
 , "f on = on"
 ]
 , testOneCodeAction
-"delete unsued"
+"delete unused"
 "Hide on from Data.Function"
 (1, 2)
 (1, 4)
@@ -1982,7 +1982,7 @@ suggestHideShadowTests =
 ]
 , testOneCodeAction
 "auto hide all"
-"Hide ++ from all occurence imports"
+"Hide ++ from all occurrence imports"
 (2, 2)
 (2, 6)
 [ "import B"

View File

@@ -176,7 +176,7 @@ replaceRefs newName refs = everywhere $ mkT replaceLoc
 ---------------------------------------------------------------------------------------------------
 -- Reference finding
--- | Note: We only find exact name occurences (i.e. type reference "depth" is 0).
+-- | Note: We only find exact name occurrences (i.e. type reference "depth" is 0).
 refsAtName ::
 MonadIO m =>
 IdeState ->

View File

@@ -1,2 +1,2 @@
 ### 0.1.1.0 (2021-02-..)
-* Fix bug in Retrive "fold/unfold in local file" commands (#1202)
+* Fix bug in Retrieve "fold/unfold in local file" commands (#1202)

View File

@@ -99,7 +99,7 @@ expandTHSplice _eStyle ideState params@ExpandSpliceParams {..} = do
 liftIO $ runAction "expandTHSplice.fallback.TypeCheck (stale)" ideState $ useWithStale TypeCheck fp
 (TcModuleResult {..}, _) <-
 maybe
-(throwE "Splice expansion: Type-checking information not found in cache.\nYou can once delete or replace the macro with placeholder, convince the type checker and then revert to original (errornous) macro and expand splice again."
+(throwE "Splice expansion: Type-checking information not found in cache.\nYou can once delete or replace the macro with placeholder, convince the type checker and then revert to original (erroneous) macro and expand splice again."
 )
 pure mresl
 reportEditor
@@ -166,7 +166,7 @@ expandTHSplice _eStyle ideState params@ExpandSpliceParams {..} = do
 (graftDecls (RealSrcSpan spliceSpan Nothing) expanded)
 ps
 <&>
--- FIXME: Why ghc-exactprint sweeps preceeding comments?
+-- FIXME: Why ghc-exactprint sweeps preceding comments?
 adjustToRange uri range
 res <- liftIO $ runMaybeT $ do
@@ -483,7 +483,7 @@ codeAction state plId (CodeActionParams _ _ docId ran _) = liftIO $
 _ -> Stop
 -- | Like 'something', but performs top-down searching, cutoffs when 'Stop' received,
--- and picks inenrmost result.
+-- and picks innermost result.
 something' :: forall a. GenericQ (SearchResult a) -> GenericQ (Maybe a)
 something' f = go
 where

View File

@@ -51,4 +51,4 @@ inplaceCmdName :: T.Text
 inplaceCmdName = "expand TemplateHaskell Splice (in-place)"
 commentedCmdName :: T.Text
-commentedCmdName = "expand TemplateHaskell Splice (comented-out)"
+commentedCmdName = "expand TemplateHaskell Splice (commented-out)"

View File

@@ -206,7 +206,7 @@ running `collapse` will produce:
 arguments: single reference.
 deterministic.
-> Use the given data cosntructor.
+> Use the given data constructor.
 ### Example
@@ -254,7 +254,7 @@ case a of
 arguments: none.
 deterministic.
-> Pattern match on every function paramater, in original binding order.
+> Pattern match on every function parameter, in original binding order.
 ### Example
@@ -358,7 +358,7 @@ running `intro aye` will produce:
 ## intros
-arguments: varadic binding.
+arguments: variadic binding.
 deterministic.
 > Construct a lambda expression, using the specific names if given, generating unique names otherwise. When no arguments are given, all of the function arguments will be bound; otherwise, this tactic will bind only enough to saturate the given names. Extra names are ignored.
@@ -408,7 +408,7 @@ running `intros x y z w` will produce:
 ## let
-arguments: varadic binding.
+arguments: variadic binding.
 deterministic.
 > Create let-bindings for each binder given to this tactic.

View File

@@ -34,7 +34,7 @@ import Wingman.Types
 ------------------------------------------------------------------------------
---- | Attact the 'Interaction's to a 'PluginDescriptor'. Interactions are
+-- | Attach the 'Interaction's to a 'PluginDescriptor'. Interactions are
 -- self-contained request/response pairs that abstract over the LSP, and
 -- provide a unified interface for doing interesting things, without needing to
 -- dive into the underlying API too directly.

View File

@@ -167,7 +167,7 @@ graftDecl dflags dst ix make_decl (L src (AMatch (FunRhs (L _ name) _ _) pats _)
 -- For whatever reason, ExactPrint annotates newlines to the ends of
 -- case matches and type signatures, but only allows us to insert
 -- them at the beginning of those things. Thus, we need want to
---- insert a preceeding newline (done in 'annotateDecl') on all
+-- insert a preceding newline (done in 'annotateDecl') on all
 -- matches, except for the first one --- since it gets its newline
 -- from the line above.
 when (ix == 0) $

View File

@@ -222,7 +222,7 @@ destruct' use_field_puns f hi jdg = do
 ------------------------------------------------------------------------------
---- | Combinator for performign case splitting, and running sub-rules on the
+-- | Combinator for performing case splitting, and running sub-rules on the
 -- resulting matches.
 destructLambdaCase' :: Bool -> (ConLike -> Judgement -> Rule) -> Judgement -> Rule
 destructLambdaCase' use_field_puns f jdg = do

View File

@@ -218,7 +218,7 @@ pattern Lambda pats body <-
 ------------------------------------------------------------------------------
---- | A GRHS that caontains no guards.
+-- | A GRHS that contains no guards.
 pattern UnguardedRHSs :: LHsExpr p -> GRHSs p (LHsExpr p)
 pattern UnguardedRHSs body <-
 GRHSs {grhssGRHSs = [L _ (GRHS _ [] body)]}
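For readers not steeped in the GHC AST: a GRHS is one (possibly guarded) right-hand side of a binding, and the `UnguardedRHSs` pattern above matches bindings whose single right-hand side carries no guards. A minimal source-level sketch of that distinction (the names `unguarded` and `guarded` are illustrative, not from this commit):

```haskell
-- One unguarded right-hand side: the shape UnguardedRHSs is meant to match.
unguarded :: Int -> Int
unguarded x = x + 1

-- Two guarded right-hand sides: UnguardedRHSs would not match this binding.
guarded :: Int -> Int
guarded x
  | x > 0     = x
  | otherwise = 0
```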

View File

@@ -112,7 +112,7 @@ introduceHypothesis f ns =
 ------------------------------------------------------------------------------
---- | Introduce bindings in the context of a lamba.
+-- | Introduce bindings in the context of a lambda.
 lambdaHypothesis
 :: Maybe OccName -- ^ The name of the top level function. For any other
 -- function, this should be 'Nothing'.
@@ -187,7 +187,7 @@ filterPosition defn pos jdg =
 findPositionVal :: Judgement' a -> OccName -> Int -> Maybe OccName
 findPositionVal jdg defn pos = listToMaybe $ do
 -- It's important to inspect the entire hypothesis here, as we need to trace
---- ancstry through potentially disallowed terms in the hypothesis.
+-- ancestry through potentially disallowed terms in the hypothesis.
 (name, hi) <- M.toList
 $ M.map (overProvenance expandDisallowed)
 $ hyByName
@@ -261,7 +261,7 @@ provAncestryOf (DisallowedPrv _ p2) = provAncestryOf p2
 ------------------------------------------------------------------------------
 -- TODO(sandy): THIS THING IS A BIG BIG HACK
 --
---- Why? 'ctxDefiningFuncs' is _all_ of the functions currently beind defined
+-- Why? 'ctxDefiningFuncs' is _all_ of the functions currently being defined
 -- (eg, we might be in a where block). The head of this list is not guaranteed
 -- to be the one we're interested in.
 extremelyStupid__definingFunction :: Context -> OccName
@@ -302,7 +302,7 @@ disallowing reason ns =
 ------------------------------------------------------------------------------
 -- | The hypothesis, consisting of local terms and the ambient environment
---- (impors and class methods.) Hides disallowed values.
+-- (imports and class methods.) Hides disallowed values.
 jHypothesis :: Judgement' a -> Hypothesis a
 jHypothesis
 = Hypothesis

View File

@@ -36,7 +36,7 @@ everythingContaining dst f = go
 ------------------------------------------------------------------------------
 -- | Helper function for implementing 'everythingWithin'
 --
---- NOTE(sandy): Subtly broken. In an ideal world, this function shuld return
+-- NOTE(sandy): Subtly broken. In an ideal world, this function should return
 -- @Just False@ for nodes of /any type/ which do not contain the span. But if
 -- this functionality exists anywhere within the SYB machinery, I have yet to
 -- find it.
@@ -49,7 +49,7 @@ genericIsSubspan dst = mkQ1 (L noSrcSpan ()) Nothing $ \case
 ------------------------------------------------------------------------------
 -- | Like 'mkQ', but allows for polymorphic instantiation of its specific case.
---- This instantation matches whenever the dynamic value has the same
+-- This instantiation matches whenever the dynamic value has the same
 -- constructor as the proxy @f ()@ value.
 mkQ1 :: forall a r f
 . (Data a, Data (f ()))
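The `mkQ1` comment above reads more easily next to its SYB cousin `mkQ`: the specific case fires only when the scrutinee matches, and a default is returned otherwise. A self-contained sketch using plain `mkQ` from the syb package (the `countInts` name is made up for illustration; `mkQ1` generalises this by matching on the constructor of a proxy `f ()` value rather than on a concrete type):

```haskell
import Data.Data     (Data)
import Data.Generics (everything, mkQ)

-- A generic query: the specific case (const 1) fires only on Int nodes,
-- the default (0) fires on everything else; 'everything (+)' sums the results.
countInts :: Data a => a -> Int
countInts = everything (+) (0 `mkQ` (const 1 :: Int -> Int))

-- e.g. countInts ([1, 2, 3] :: [Int]) == 3
```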

View File

@@ -77,7 +77,7 @@ mkSubst :: Set TyVar -> Type -> Type -> TCvSubst
 mkSubst skolems a b =
 let tyvars = S.fromList $ mapMaybe getTyVar_maybe [a, b]
 -- If we can unify our skolems, at least one is no longer a skolem.
---- Removing them from this set ensures we can get a subtitution between
+-- Removing them from this set ensures we can get a substitution between
 -- the two. But it's okay to leave them in 'ts_skolems' in general, since
 -- they won't exist after running this substitution.
 skolems' = skolems S.\\ tyvars

View File

@@ -44,8 +44,8 @@ deriveFmap = do
 -- and then calling mappend recursively. At each recursive call, we filter away
 -- any binding that isn't in an analogous position.
 --
---- The recursive call first attempts to use an instace in scope. If that fails,
---- it fals back to trying a theta method from the hypothesis with the correct
+-- The recursive call first attempts to use an instance in scope. If that fails,
+-- it falls back to trying a theta method from the hypothesis with the correct
 -- name.
 deriveMappend :: TacticsM ()
 deriveMappend = do
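For context on the comment above: `deriveMappend` is trying to synthesise the body of `mappend`/`(<>)` by destructing both arguments and combining fields that sit in analogous positions. Written out by hand for a toy product type, the target looks roughly like this (the `Pair` type and instances are illustrative, not part of this commit):

```haskell
data Pair a b = Pair a b

-- Destructure both arguments, then combine field-by-field using the
-- Semigroup instance (or, in the tactic's case, a theta method) in scope.
instance (Semigroup a, Semigroup b) => Semigroup (Pair a b) where
  Pair a1 b1 <> Pair a2 b2 = Pair (a1 <> a2) (b1 <> b2)

instance (Monoid a, Monoid b) => Monoid (Pair a b) where
  mempty = Pair mempty mempty   -- mappend defaults to (<>) from above
```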

View File

@@ -461,7 +461,7 @@ mkFakeVar = do
 ------------------------------------------------------------------------------
---- | Construct a fake varible to attach the current 'Provenance' to, and then
+-- | Construct a fake variable to attach the current 'Provenance' to, and then
 -- build a sub-hypothesis for the pattern match.
 mkDerivedConHypothesis
 :: Provenance

View File

@@ -255,7 +255,7 @@ unify goal inst = do
 Nothing -> cut
 ------------------------------------------------------------------------------
---- | Get a substition out of a theta's fundeps
+-- | Get a substitution out of a theta's fundeps
 learnFromFundeps
 :: ThetaType
 -> RuleM ()

View File

@@ -36,12 +36,12 @@ data Count a where
 prettyCount :: Count a -> Doc b
 prettyCount One = "single"
-prettyCount Many = "varadic"
+prettyCount Many = "variadic"
 ------------------------------------------------------------------------------
 -- | What sorts of arguments does the tactic take? Currently there is no
---- distincion between 'Ref' and 'Bind', other than documentation.
+-- distinction between 'Ref' and 'Bind', other than documentation.
 --
 -- The type index here is used for the shape of the function the parser should
 -- take.

View File

@@ -33,7 +33,7 @@ pattern Lambda pats body <-
 ------------------------------------------------------------------------------
---- | Simlify an expression.
+-- | Simplify an expression.
 simplify :: LHsExpr GhcPs -> LHsExpr GhcPs
 simplify
 = (!!3) -- Do three passes; this should be good enough for the limited
@@ -65,7 +65,7 @@ simplifyEtaReduce = mkT $ \case
 (unsnoc -> Just (pats, VarPat _ (L _ pat)))
 (HsApp _ (L _ f) (L _ (HsVar _ (L _ a))))
 | pat == a
---- We can only perform this simplifiation if @pat@ is otherwise unused.
+-- We can only perform this simplification if @pat@ is otherwise unused.
 , not (containsHsVar pat f) ->
 Lambda pats f
 x -> x
@@ -87,7 +87,7 @@ simplifyCompose = mkT $ \case
 (unsnoc -> Just (pats, VarPat _ (L _ pat)))
 (unroll -> (fs@(_:_), HsVar _ (L _ a)))
 | pat == a
---- We can only perform this simplifiation if @pat@ is otherwise unused.
+-- We can only perform this simplification if @pat@ is otherwise unused.
 , not (containsHsVar pat fs) ->
 Lambda pats (foldr1 (infixCall ".") fs)
 x -> x
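Both hunks above touch comments on the same pair of rewrites: a lambda whose final pattern is used only as the last argument can be eta-reduced, and a chain of applications of that argument can be rebuilt as a composition. At the source level the two simplifications look like this (hand-written illustration; `etaReduced` and `composed` are made-up names, not output of this code):

```haskell
-- Eta reduction:  \xs -> map succ xs   ~>   map succ
-- (valid only because xs is not used anywhere else in the body).
etaReduced :: [Int] -> [Int]
etaReduced = map succ

-- Composition:  \x -> succ (negate x)  ~>   succ . negate
-- (the nested applications are unrolled and glued back together with (.)).
composed :: Int -> Int
composed = succ . negate
```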

View File

@@ -76,8 +76,8 @@ mkTest
 => String -- ^ The test name
 -> FilePath -- ^ The file name stem (without extension) to load
 -> Int -- ^ Cursor line
--> Int -- ^ Cursor columnn
--> t ( Bool -> Bool -- Use 'not' for actions that shouldnt be present
+-> Int -- ^ Cursor column
+-> t ( Bool -> Bool -- Use 'not' for actions that shouldn't be present
 , TacticCommand -- An expected command ...
 , Text -- ... for this variable
 ) -- ^ A collection of (un)expected code actions.

View File

@@ -41,7 +41,7 @@
     <experiment>.log - bench stdout
     results.csv - results of all the experiments for the example
 results.csv - aggregated results of all the examples, experiments, versions and configurations
-<experiment>.svg - graph of bytes over elapsed time, for all the examples, experiments, versions and configuratiof
+<experiment>.svg - graph of bytes over elapsed time, for all the examples, experiments, versions and configurations
 For diff graphs, the "previous version" is the preceding entry in the list of versions
 in the config file. A possible improvement is to obtain this info via `git rev-list`.