Multi Component (#522)

* Multi component support

In this commit we add support for loading multiple components into one
ghcide session.

The current behaviour is that each component is loaded lazily into the
session. When a file from an unrecognised component is loaded, the
cradle is consulted again to get a new set of options for the new
component. This will cause all the currently loaded files to be
reloaded into a new HscEnv which is shared by all the currently known
components. As a result, functions such as go-to definition work between
components if they have been loaded into the same session, but you have to
open at least one file from each component before it will work.

Only minimal changes are needed to the internals of ghcide to make the
file searching logic look in the include directories of all currently
loaded components. The main changes are in exe/Main.hs, which has been
heavily rewritten to avoid shake indirections. A global map is created
which maps a filepath to the HscEnv which should be used to compile it.
When a new component is created this map is completely refreshed so that
each path maps to a new HscEnv.
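
For orientation, these are the maps involved, condensed from the exe/Main.hs
changes shown further down (ghcide-internal types such as HscEnv, HscEnvEq,
IdeResult, NormalizedFilePath and RawComponentInfo are defined elsewhere in
the tree):

```haskell
import qualified Data.HashMap.Strict as HM
import qualified Data.Map.Strict as Map
import Data.Time.Clock (UTCTime)

-- One HscEnv per hie.yaml file, together with the raw information about
-- every component loaded into it so far.
type HieMap = Map.Map (Maybe FilePath) (HscEnv, [RawComponentInfo])

-- For every file the cradle has told us about, the HscEnvEq (plus any
-- diagnostics) that should be used to compile it, together with the
-- modification times of the cradle dependencies it was derived from.
type FlagsMap = Map.Map (Maybe FilePath)
                        (HM.HashMap NormalizedFilePath (IdeResult HscEnvEq, DependencyInfo))

-- Cradle dependencies (hie.yaml, *.cabal, stack.yaml, ...) mapped to their
-- last modification time; used to decide when the caches must be refreshed.
type DependencyInfo = Map.Map FilePath (Maybe UTCTime)
```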

Which paths belong to a component is determined by the targets listed
by the cradle. Therefore it is important that each cradle lists all of
its targets. There are other, less accurate, choices here as well, such
as mapping via include directories, which is the approach I implemented
in haskell-ide-engine.
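
For example, a hie.yaml for a project with two sibling cabal packages could
list its targets like this (an illustrative layout only; the test/data/multi
fixture added in this PR follows the same pattern, but its exact contents are
not reproduced here):

```yaml
# hie.yaml at the project root.  Each entry maps a source directory to the
# cabal component that owns the files underneath it, so that every target
# of the project is covered by the cradle.
cradle:
  cabal:
    - path: "./a"
      component: "a:lib:a"
    - path: "./b"
      component: "b:lib:b"
```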

The commit has been tested so far with cabal and hadrian.

Also deleted the .ghci file which was causing errors during testing and
seemed broken anyway.

Co-authored-by: Alan Zimmerman <alan.zimm@gmail.com>
Co-authored-by: fendor <power.walross@gmail.com>

* Final tweaks?

* Fix 8.4 build

* Add multi-component test

* Fix hlint

* Add cabal to CI images

* Modify path

* Set PATH in the right place (hopefully)

* Always generate interface files and hie files

* Use correct DynFlags in mkImportDirs

You have to use the DynFlags for the file we are currently compiling to
get the right packages in the package db so that lookupPackage doesn't
always fail.
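
The relevant pieces, excerpted from the Compat.hs and FindImports.hs changes
further down: lookupPackage runs against the DynFlags of the file currently
being compiled (the first argument), while the import paths come from the
DynFlags of the other loaded component:

```haskell
-- Development.IDE.GHC.Compat (excerpt)
getPackageName :: DynFlags -> Module.InstalledUnitId -> Maybe PackageName
getPackageName dfs i =
    packageName <$> lookupPackage dfs (Module.DefiniteUnitId (Module.DefUnitId i))

-- Development.IDE.Import.FindImports (excerpt): map a loaded component's
-- unit id to its package name and import paths, resolved with the DynFlags
-- of the module we are compiling right now.
mkImportDirs :: DynFlags -> (M.InstalledUnitId, DynFlags) -> Maybe (PackageName, [FilePath])
mkImportDirs df (i, DynFlags{importPaths}) = (, importPaths) <$> getPackageName df i
```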

* Revert "Always generate interface files and hie files"

This reverts commit 820aa24189.

* remove traces

* Another test

* lint

* Unset env vars set by stack

* Fix extra-source-files

As usual, stack doesn’t understand Cabal properly and doesn’t seem to
like ** wildcards, so I’ve enumerated the files manually.

* Unset env locally

Co-authored-by: Alan Zimmerman <alan.zimm@gmail.com>
Co-authored-by: fendor <power.walross@gmail.com>
Co-authored-by: Moritz Kiefer <moritz.kiefer@purelyfunctional.org>
Matthew Pickering 2020-06-02 13:44:16 +01:00, committed by GitHub
parent 51907fe47a
commit 373c4060df
30 changed files with 854 additions and 438 deletions


@ -32,7 +32,9 @@ jobs:
./fmt.sh
displayName: "HLint via ./fmt.sh"
- bash: |
sudo apt-get install -y g++ gcc libc6-dev libffi-dev libgmp-dev make zlib1g-dev
sudo add-apt-repository ppa:hvr/ghc
sudo apt-get update
sudo apt-get install -y g++ gcc libc6-dev libffi-dev libgmp-dev make zlib1g-dev cabal-install-3.2
if ! which stack >/dev/null 2>&1; then
curl -sSL https://get.haskellstack.org/ | sh
fi
@ -41,7 +43,9 @@ jobs:
displayName: 'stack setup'
- bash: stack build --only-dependencies --stack-yaml=$STACK_YAML
displayName: 'stack build --only-dependencies'
- bash: stack test --ghc-options=-Werror --stack-yaml=$STACK_YAML || stack test --ghc-options=-Werror --stack-yaml=$STACK_YAML|| stack test --ghc-options=-Werror --stack-yaml=$STACK_YAML
- bash: |
export PATH=/opt/cabal/bin:$PATH
stack test --ghc-options=-Werror --stack-yaml=$STACK_YAML || stack test --ghc-options=-Werror --stack-yaml=$STACK_YAML|| stack test --ghc-options=-Werror --stack-yaml=$STACK_YAML
# ghcide stack tests are flaky, see https://github.com/digital-asset/daml/issues/2606.
displayName: 'stack test --ghc-options=-Werror'
- bash: |


@ -46,6 +46,7 @@ jobs:
# Installing happy and alex standalone to avoid error "strip.exe: unable to rename ../*.exe; reason: File exists"
stack install happy --stack-yaml $STACK_YAML
stack install alex --stack-yaml $STACK_YAML
stack install cabal-install --stack-yaml $STACK_YAML
stack build --only-dependencies --stack-yaml $STACK_YAML
displayName: 'stack build --only-dependencies'
- bash: stack test --no-run-tests --ghc-options=-Werror --stack-yaml $STACK_YAML

.ghci

@ -1,25 +0,0 @@
:set -Wunused-binds -Wunused-imports -Worphans -Wunused-matches -Wincomplete-patterns
:set -XBangPatterns
:set -XDeriveFunctor
:set -XDeriveGeneric
:set -XGeneralizedNewtypeDeriving
:set -XLambdaCase
:set -XNamedFieldPuns
:set -XOverloadedStrings
:set -XRecordWildCards
:set -XScopedTypeVariables
:set -XStandaloneDeriving
:set -XTupleSections
:set -XTypeApplications
:set -XViewPatterns
:set -package=ghc
:set -ignore-package=ghc-lib-parser
:set -DGHC_STABLE
:set -Iinclude
:set -idist/build/autogen
:set -isrc
:set -iexe
:load Main


@ -25,6 +25,23 @@ There are more details about our approach [in this blog post](https://4ta.uk/p/s
| Display type and source module of values | hover |
| Remove redundant imports, replace suggested typos for values and module imports, fill type holes, insert missing type signatures, add suggested ghc extensions | codeAction (quickfix) |
## Limitations to Multi-Component support
`ghcide` supports loading multiple components into the same session so that
features such as go-to definition work across components. However, there are
some limitations to this.
1. You will currently get much better results by manually specifying the hie.yaml file.
Until tools like cabal and stack provide the right interface to support multi-component
projects, it is advisable to specify explicitly how your project is partitioned.
2. Cross-component features only work if you have loaded at least one file
from each component.
3. There is a known issue where, if you have three components such that A depends on B, which depends on C,
and you load A and C into the session but not B, then under certain situations you
can get strange errors about a type coming from two different places. See [this repo](https://github.com/fendor/ghcide-bad-interface-files) for
a simple reproduction of the bug.
## Using it
### Install `ghcide`
@ -308,7 +325,7 @@ Now opening a `.hs` file should work with `ghcide`.
## History and relationship to other Haskell IDE's
The teams behind this project and the [`haskell-ide-engine`](https://github.com/haskell/haskell-ide-engine#readme) have agreed to join forces under the [`haskell-language-server` project](https://github.com/haskell/haskell-language-server), see the [original announcement](https://neilmitchell.blogspot.com/2020/01/one-haskell-ide-to-rule-them-all.html). The technical work is ongoing, with the likely model being that this project serves as the core, while plugins and integrations are kept in the [`haskell-language-server` project](https://github.com/haskell/haskell-language-server).
The code behind `ghcide` was originally developed by [Digital Asset](https://digitalasset.com/) as part of the [DAML programming language](https://github.com/digital-asset/daml). DAML is a smart contract language targeting distributed-ledger runtimes, based on [GHC](https://www.haskell.org/ghc/) with custom language extensions. The DAML programming language has [an IDE](https://webide.daml.com/), and work was done to separate off a reusable Haskell-only IDE (what is now `ghcide`) which the [DAML IDE then builds upon](https://github.com/digital-asset/daml/tree/master/compiler/damlc). Since that time, there have been various [non-Digital Asset contributors](https://github.com/digital-asset/ghcide/graphs/contributors), in addition to continued investment by Digital Asset. All contributions require a [Contributor License Agreement](https://cla.digitalasset.com/digital-asset/ghcide) that states you license the code under the [Apache License](LICENSE).


@ -7,16 +7,26 @@
module Main(main) where
import Linker (initDynLinker)
import Data.IORef
import NameCache
import Packages
import Module
import Arguments
import Data.Maybe
import Data.List.Extra
import System.FilePath
import Control.Concurrent.Async
import Control.Concurrent.Extra
import Control.Exception
import Control.Monad.Extra
import Control.Monad.IO.Class
import Data.Default
import System.Time.Extra
import Data.Either
import Data.Function
import Data.List.Extra
import Data.Maybe
import qualified Data.Text as T
import qualified Data.Text.IO as T
import Data.Time.Clock (UTCTime)
import Data.Version
import Development.IDE.Core.Debouncer
import Development.IDE.Core.FileStore
import Development.IDE.Core.OfInterest
@ -33,25 +43,39 @@ import Development.IDE.GHC.Util
import Development.IDE.Plugin
import Development.IDE.Plugin.Completions as Completions
import Development.IDE.Plugin.CodeAction as CodeAction
import qualified Data.Text as T
import qualified Data.Text.IO as T
import qualified Language.Haskell.LSP.Core as LSP
import Language.Haskell.LSP.Messages
import Language.Haskell.LSP.Types (LspId(IdInt))
import Data.Version
import Language.Haskell.LSP.Types
import Data.Aeson (ToJSON(toJSON))
import Development.IDE.LSP.LanguageServer
import qualified System.Directory.Extra as IO
import System.Environment
import System.IO
import System.Exit
import System.FilePath
import System.Directory
import System.Time.Extra
import HIE.Bios.Environment (addCmdOpts, makeDynFlagsAbsolute)
import Paths_ghcide
import Development.GitRev
import Development.Shake (Action, Rules, action)
import Development.Shake (Action, action)
import qualified Data.HashSet as HashSet
import qualified Data.HashMap.Strict as HM
import qualified Data.Map.Strict as Map
import HIE.Bios
import Rules
import RuleTypes
import qualified Crypto.Hash.SHA1 as H
import qualified Data.ByteString.Char8 as B
import Data.ByteString.Base16 (encode)
import DynFlags (gopt_set, gopt_unset, updOptLevel, PackageFlag(..), PackageArg(..))
import GhcMonad
import HscTypes (HscEnv(..), ic_dflags)
import GHC hiding (def)
import GHC.Check ( VersionCheck(..), makeGhcVersionChecker )
import Data.Either.Extra
import HIE.Bios.Cradle
import HIE.Bios.Types
import Utils
ghcideVersion :: IO String
ghcideVersion = do
@ -97,14 +121,14 @@ main = do
runLanguageServer options (pluginHandler plugins) onInitialConfiguration onConfigurationChange $ \getLspId event vfs caps -> do
t <- t
hPutStrLn stderr $ "Started LSP server in " ++ showDuration t
let options = (defaultIdeOptions $ loadSession dir)
let options = (defaultIdeOptions $ loadSessionShake dir)
{ optReportProgress = clientSupportsProgress caps
, optShakeProfiling = argsShakeProfiling
, optTesting = argsTesting
, optThreads = argsThreads
}
debouncer <- newAsyncDebouncer
initialise caps (cradleRules >> mainRule >> pluginRules plugins >> action kick)
initialise caps (mainRule >> pluginRules plugins >> action kick)
getLspId event (logger minBound) debouncer options vfs
else do
-- GHC produces messages with UTF8 in them, so make sure the terminal doesn't error
@ -114,55 +138,32 @@ main = do
putStrLn $ "Ghcide setup tester in " ++ dir ++ "."
putStrLn "Report bugs at https://github.com/digital-asset/ghcide/issues"
putStrLn $ "\nStep 1/6: Finding files to test in " ++ dir
putStrLn $ "\nStep 1/4: Finding files to test in " ++ dir
files <- expandFiles (argFiles ++ ["." | null argFiles])
-- LSP works with absolute file paths, so try and behave similarly
files <- nubOrd <$> mapM IO.canonicalizePath files
putStrLn $ "Found " ++ show (length files) ++ " files"
putStrLn "\nStep 2/6: Looking for hie.yaml files that control setup"
putStrLn "\nStep 2/4: Looking for hie.yaml files that control setup"
cradles <- mapM findCradle files
let ucradles = nubOrd cradles
let n = length ucradles
putStrLn $ "Found " ++ show n ++ " cradle" ++ ['s' | n /= 1]
sessions <- forM (zipFrom (1 :: Int) ucradles) $ \(i, x) -> do
let msg = maybe ("Implicit cradle for " ++ dir) ("Loading " ++) x
putStrLn $ "\nStep 3/6, Cradle " ++ show i ++ "/" ++ show n ++ ": " ++ msg
cradle <- maybe (loadImplicitCradle $ addTrailingPathSeparator dir) loadCradle x
when (isNothing x) $ print cradle
putStrLn $ "\nStep 4/6, Cradle " ++ show i ++ "/" ++ show n ++ ": Loading GHC Session"
opts <- getComponentOptions cradle
createSession opts
putStrLn "\nStep 5/6: Initializing the IDE"
putStrLn "\nStep 3/4: Initializing the IDE"
vfs <- makeVFSHandle
let cradlesToSessions = Map.fromList $ zip ucradles sessions
let filesToCradles = Map.fromList $ zip files cradles
let grab file = fromMaybe (head sessions) $ do
cradle <- Map.lookup file filesToCradles
Map.lookup cradle cradlesToSessions
debouncer <- newAsyncDebouncer
ide <- initialise def mainRule (pure $ IdInt 0) (showEvent lock) (logger Info) debouncer (defaultIdeOptions $ loadSessionShake dir) vfs
let options =
(defaultIdeOptions $ return $ return . grab)
{ optShakeProfiling = argsShakeProfiling }
ide <- initialise def (cradleRules >> mainRule) (pure $ IdInt 0) (showEvent lock) (logger Info) noopDebouncer options vfs
putStrLn "\nStep 6/6: Type checking the files"
putStrLn "\nStep 4/4: Type checking the files"
setFilesOfInterest ide $ HashSet.fromList $ map toNormalizedFilePath' files
results <- runActionSync ide $ uses TypeCheck $ map toNormalizedFilePath' files
results <- runActionSync ide $ uses TypeCheck (map toNormalizedFilePath' files)
let (worked, failed) = partition fst $ zip (map isJust results) files
when (failed /= []) $
putStr $ unlines $ "Files that failed:" : map ((++) " * " . snd) failed
let files xs = let n = length xs in if n == 1 then "1 file" else show n ++ " files"
putStrLn $ "\nCompleted (" ++ files worked ++ " worked, " ++ files failed ++ " failed)"
unless (null failed) exitFailure
cradleRules :: Rules ()
cradleRules = do
loadGhcSession
cradleToSession
return ()
expandFiles :: [FilePath] -> IO [FilePath]
expandFiles = concatMapM $ \x -> do
@ -189,22 +190,404 @@ showEvent lock (EventFileDiagnostics (toNormalizedFilePath' -> file) diags) =
withLock lock $ T.putStrLn $ showDiagnosticsColored $ map (file,ShowDiag,) diags
showEvent lock e = withLock lock $ print e
loadSession :: FilePath -> Action (FilePath -> Action HscEnvEq)
loadSession dir = liftIO $ do
cradleLoc <- memoIO $ \v -> do
res <- findCradle v
-- Sometimes we get C:, sometimes we get c:, and sometimes we get a relative path
-- try and normalise that
-- e.g. see https://github.com/digital-asset/ghcide/issues/126
res' <- traverse IO.makeAbsolute res
return $ normalise <$> res'
let session :: Maybe FilePath -> Action HscEnvEq
session file = do
-- In the absence of a cradle file, just pass the directory from where to calculate an implicit cradle
let cradle = toNormalizedFilePath' $ fromMaybe dir file
use_ LoadCradle cradle
return $ \file -> session =<< liftIO (cradleLoc file)
-- | Run the specific cradle on a specific FilePath via hie-bios.
cradleToSessionOpts :: Cradle a -> FilePath -> IO (Either [CradleError] ComponentOptions)
cradleToSessionOpts cradle file = do
let showLine s = putStrLn ("> " ++ s)
cradleRes <- runCradle (cradleOptsProg cradle) showLine file
case cradleRes of
CradleSuccess r -> pure (Right r)
CradleFail err -> return (Left [err])
-- For the None cradle perhaps we still want to report an Info
-- message about the fact that the file is being ignored.
CradleNone -> return (Left [])
emptyHscEnv :: IO HscEnv
emptyHscEnv = do
libdir <- getLibdir
env <- runGhc (Just libdir) getSession
initDynLinker env
pure env
-- | Convert a target to a list of potential absolute paths.
-- A TargetModule can be anywhere listed by the supplied include
-- directories.
-- A target file is a relative path but with a specific prefix, so we just
-- need to canonicalise it.
targetToFile :: [FilePath] -> TargetId -> IO [NormalizedFilePath]
targetToFile is (TargetModule mod) = do
let fps = [i </> moduleNameSlashes mod -<.> ext | ext <- exts, i <- is ]
exts = ["hs", "hs-boot", "lhs"]
mapM (fmap toNormalizedFilePath' . canonicalizePath) fps
targetToFile _ (TargetFile f _) = do
f' <- canonicalizePath f
return [toNormalizedFilePath' f']
setNameCache :: IORef NameCache -> HscEnv -> HscEnv
setNameCache nc hsc = hsc { hsc_NC = nc }
loadSessionShake :: FilePath -> Action (FilePath -> Action (IdeResult HscEnvEq))
loadSessionShake fp = do
se <- getShakeExtras
IdeOptions{optTesting} <- getIdeOptions
res <- liftIO $ loadSession optTesting se fp
return (fmap liftIO res)
-- | This is the key function which implements multi-component support. All
-- components mapping to the same hie.yaml file are mapped to the same
-- HscEnv which is updated as new components are discovered.
loadSession :: Bool -> ShakeExtras -> FilePath -> IO (FilePath -> IO (IdeResult HscEnvEq))
loadSession optTesting ShakeExtras{logger, eventer} dir = do
-- Mapping from hie.yaml file to HscEnv, one per hie.yaml file
hscEnvs <- newVar Map.empty :: IO (Var HieMap)
-- Mapping from a Filepath to HscEnv
fileToFlags <- newVar Map.empty :: IO (Var FlagsMap)
-- This caches the mapping from Mod.hs -> hie.yaml
cradleLoc <- memoIO $ \v -> do
res <- findCradle v
-- Sometimes we get C:, sometimes we get c:, and sometimes we get a relative path
-- try and normalise that
-- e.g. see https://github.com/digital-asset/ghcide/issues/126
res' <- traverse IO.makeAbsolute res
return $ normalise <$> res'
-- Create a new HscEnv from a hieYaml root and a set of options
-- If the hieYaml file already has an HscEnv, the new component is
-- combined with the components in the old HscEnv into a new HscEnv
-- which contains the union.
let packageSetup :: (Maybe FilePath, NormalizedFilePath, ComponentOptions)
-> IO (HscEnv, ComponentInfo, [ComponentInfo])
packageSetup (hieYaml, cfp, opts) = do
-- Parse DynFlags for the newly discovered component
hscEnv <- emptyHscEnv
(df, targets) <- evalGhcEnv hscEnv $
setOptions opts (hsc_dflags hscEnv)
dep_info <- getDependencyInfo (componentDependencies opts ++ maybeToList hieYaml)
-- Now lookup to see whether we are combining with an existing HscEnv
-- or making a new one. The lookup returns the HscEnv and a list of
-- information about other components loaded into the HscEnv
-- (unitId, DynFlag, Targets)
modifyVar hscEnvs $ \m -> do
-- Just deps if there's already an HscEnv
-- Nothing if it's the first time we are making an HscEnv
let oldDeps = Map.lookup hieYaml m
let -- Add the raw information about this component to the list
-- We will modify the unitId and DynFlags used for
-- compilation but these are the true source of
-- information.
new_deps = RawComponentInfo (thisInstalledUnitId df) df targets cfp opts dep_info
: maybe [] snd oldDeps
-- Get all the unit-ids for things in this component
inplace = map rawComponentUnitId new_deps
new_deps' <- forM new_deps $ \RawComponentInfo{..} -> do
-- Remove all inplace dependencies from package flags for
-- components in this HscEnv
let (df2, uids) = removeInplacePackages inplace rawComponentDynFlags
let prefix = show rawComponentUnitId
-- See Note [Avoiding bad interface files]
processed_df <- setCacheDir logger prefix (sort $ map show uids) opts df2
-- The final component information, mostly the same but the DynFlags don't
-- contain any packages which are also loaded
-- into the same component.
pure $ ComponentInfo rawComponentUnitId
processed_df
uids
rawComponentTargets
rawComponentFP
rawComponentCOptions
rawComponentDependencyInfo
-- Make a new HscEnv, we have to recompile everything from
-- scratch again (for now)
-- It's important to keep the same NameCache though for reasons
-- that I do not fully understand
logInfo logger (T.pack ("Making new HscEnv" ++ show inplace))
hscEnv <- case oldDeps of
Nothing -> emptyHscEnv
Just (old_hsc, _) -> setNameCache (hsc_NC old_hsc) <$> emptyHscEnv
newHscEnv <-
-- Add the options for the current component to the HscEnv
evalGhcEnv hscEnv $ do
_ <- setSessionDynFlags df
getSession
-- Modify the map so the hieYaml now maps to the newly created
-- HscEnv
-- Returns
-- . the new HscEnv so it can be used to modify the
-- FilePath -> HscEnv map (fileToFlags)
-- . The information for the new component which caused this cache miss
-- . The modified information (without -inplace flags) for
-- existing packages
pure (Map.insert hieYaml (newHscEnv, new_deps) m, (newHscEnv, head new_deps', tail new_deps'))
let session :: (Maybe FilePath, NormalizedFilePath, ComponentOptions) -> IO (IdeResult HscEnvEq)
session (hieYaml, cfp, opts) = do
(hscEnv, new, old_deps) <- packageSetup (hieYaml, cfp, opts)
-- Make a map from unit-id to DynFlags, this is used when trying to
-- resolve imports. (especially PackageImports)
let uids = map (\ci -> (componentUnitId ci, componentDynFlags ci)) (new : old_deps)
-- For each component, now make a new HscEnvEq which contains the
-- HscEnv for the hie.yaml file but the DynFlags for that component
-- New HscEnv for the component in question, returns the new HscEnvEq and
-- a mapping from FilePath to the newly created HscEnvEq.
let new_cache = newComponentCache logger hscEnv uids
(cs, res) <- new_cache new
-- Modified cache targets for everything else in the hie.yaml file
-- which now uses the same EPS and so on
cached_targets <- concatMapM (fmap fst . new_cache) old_deps
modifyVar_ fileToFlags $ \var -> do
pure $ Map.insert hieYaml (HM.fromList (cs ++ cached_targets)) var
return (fst res)
let consultCradle :: Maybe FilePath -> FilePath -> IO (IdeResult HscEnvEq)
consultCradle hieYaml cfp = do
when optTesting $ eventer $ notifyCradleLoaded cfp
logInfo logger $ T.pack ("Consulting the cradle for " <> show cfp)
cradle <- maybe (loadImplicitCradle $ addTrailingPathSeparator dir) loadCradle hieYaml
eopts <- cradleToSessionOpts cradle cfp
logDebug logger $ T.pack ("Session loading result: " <> show eopts)
case eopts of
-- The cradle gave us some options so get to work turning them
-- into an HscEnv.
Right opts -> do
session (hieYaml, toNormalizedFilePath' cfp, opts)
-- Failure case, either a cradle error or the none cradle
Left err -> do
dep_info <- getDependencyInfo (maybeToList hieYaml)
let ncfp = toNormalizedFilePath' cfp
let res = (map (renderCradleError ncfp) err, Nothing)
modifyVar_ fileToFlags $ \var -> do
pure $ Map.insertWith HM.union hieYaml (HM.singleton ncfp (res, dep_info)) var
return res
-- This caches the mapping from hie.yaml + Mod.hs -> [String]
let sessionOpts :: (Maybe FilePath, FilePath) -> IO (IdeResult HscEnvEq)
sessionOpts (hieYaml, file) = do
v <- fromMaybe HM.empty . Map.lookup hieYaml <$> readVar fileToFlags
cfp <- canonicalizePath file
case HM.lookup (toNormalizedFilePath' cfp) v of
Just (opts, old_di) -> do
deps_ok <- checkDependencyInfo old_di
if not deps_ok
then do
-- If the dependencies are out of date then clear both caches and start
-- again.
modifyVar_ fileToFlags (const (return Map.empty))
-- Keep the same name cache
modifyVar_ hscEnvs (return . Map.adjust (\(h, _) -> (h, [])) hieYaml )
consultCradle hieYaml cfp
else return opts
Nothing -> consultCradle hieYaml cfp
dummyAs <- async $ return (error "Uninitialised")
runningCradle <- newVar dummyAs :: IO (Var (Async (IdeResult HscEnvEq)))
-- The main function which gets options for a file. We only want one of these running
-- at a time. Therefore the IORef contains the currently running cradle; if we try
-- to get some more options then we wait for the currently running action to finish
-- before attempting to do so.
let getOptions :: FilePath -> IO (IdeResult HscEnvEq)
getOptions file = do
hieYaml <- cradleLoc file
sessionOpts (hieYaml, file)
return $ \file -> do
join $ mask_ $ modifyVar runningCradle $ \as -> do
-- If the cradle is not finished, then wait for it to finish.
void $ wait as
as <- async $ getOptions file
return (as, wait as)
-- | Create a mapping from FilePaths to HscEnvEqs
newComponentCache
:: Logger
-> HscEnv
-> [(InstalledUnitId, DynFlags)]
-> ComponentInfo
-> IO ([(NormalizedFilePath, (IdeResult HscEnvEq, DependencyInfo))], (IdeResult HscEnvEq, DependencyInfo))
newComponentCache logger hsc_env uids ci = do
let df = componentDynFlags ci
let hscEnv' = hsc_env { hsc_dflags = df
, hsc_IC = (hsc_IC hsc_env) { ic_dflags = df } }
versionMismatch <- checkGhcVersion
henv <- case versionMismatch of
Just mismatch -> return mismatch
Nothing -> newHscEnvEq hscEnv' uids
let res = (([], Just henv), componentDependencyInfo ci)
logDebug logger ("New Component Cache HscEnvEq: " <> T.pack (show res))
let is = importPaths df
ctargets <- concatMapM (targetToFile is . targetId) (componentTargets ci)
-- A special target for the file which caused this wonderful
-- component to be created, in case the cradle doesn't list all the targets
-- for the component (in which case things will be horribly broken anyway).
-- Without this target we would immediately attempt to reload this module,
-- which causes an infinite loop and high CPU usage.
let special_target = (componentFP ci, res)
let xs = map (,res) ctargets
return (special_target:xs, res)
{- Note [Avoiding bad interface files]
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Originally, we set the cache directory for the various components once
on the first occurrence of the component.
This works fine if these components have no references to each other,
but if you have components that depend on each other, the interface files are
updated for each component.
After restarting the session and only opening the component that depended
on the other, suddenly the interface files of this component are stale.
However, from the point of view of `ghcide`, they do not look stale,
and thus are not regenerated, and the IDE shows weird errors such as:
```
typecheckIface
Declaration for Rep_ClientRunFlags
Axiom branches Rep_ClientRunFlags:
Failed to load interface for Distribution.Simple.Flag
Use -v to see a list of the files searched for.
```
and
```
expectJust checkFamInstConsistency
CallStack (from HasCallStack):
error, called at compiler\\utils\\Maybes.hs:55:27 in ghc:Maybes
expectJust, called at compiler\\typecheck\\FamInst.hs:461:30 in ghc:FamInst
```
To mitigate this, we set the cache directory for each component dependent
on the components of the current `HscEnv`, in addition to the component options
of the respective components.
Assume two components, c1, c2, where c2 depends on c1, and the options of the
respective components are co1, co2.
If we want to load component c2, followed by c1, we set the cache directory for
each component in this way:
* Load component c2
* (Cache Directory State)
- name of c2 + co2
* Load component c1
* (Cache Directory State)
- name of c2 + name of c1 + co2
- name of c2 + name of c1 + co1
Overall, we created three cache directories. If we opened c1 first, then we
create a fourth cache directory.
This makes sure that interface files are always correctly updated.
Since this causes a lot of recompilation, we only update the cache-directory,
if the dependencies of a component have really changed.
E.g. when you load two executables, they cannot depend on each other. They
should be filtered out, such that we don't have to re-compile everything.
-}
-- | Set the cache-directory based on the ComponentOptions and a list of
-- internal packages.
-- For the exact reason, see Note [Avoiding bad interface files].
setCacheDir :: MonadIO m => Logger -> String -> [String] -> ComponentOptions -> DynFlags -> m DynFlags
setCacheDir logger prefix hscComponents comps dflags = do
cacheDir <- liftIO $ getCacheDir prefix (hscComponents ++ componentOptions comps)
liftIO $ logInfo logger $ "Using interface files cache dir: " <> T.pack cacheDir
pure $ dflags
& setHiDir cacheDir
& setDefaultHieDir cacheDir
renderCradleError :: NormalizedFilePath -> CradleError -> FileDiagnostic
renderCradleError nfp (CradleError _ec t) =
ideErrorText nfp (T.unlines (map T.pack t))
-- See Note [Multi Cradle Dependency Info]
type DependencyInfo = Map.Map FilePath (Maybe UTCTime)
type HieMap = Map.Map (Maybe FilePath) (HscEnv, [RawComponentInfo])
type FlagsMap = Map.Map (Maybe FilePath) (HM.HashMap NormalizedFilePath (IdeResult HscEnvEq, DependencyInfo))
-- This is pristine information about a component
data RawComponentInfo = RawComponentInfo
{ rawComponentUnitId :: InstalledUnitId
-- | Unprocessed DynFlags. Contains inplace packages such as libraries.
-- We do not want to use them unprocessed.
, rawComponentDynFlags :: DynFlags
-- | All targets of this component.
, rawComponentTargets :: [Target]
-- | Filepath which caused the creation of this component
, rawComponentFP :: NormalizedFilePath
-- | Component Options used to load the component.
, rawComponentCOptions :: ComponentOptions
-- | Maps cradle dependencies, such as `stack.yaml`, or `.cabal` file
-- to last modification time. See Note [Multi Cradle Dependency Info].
, rawComponentDependencyInfo :: DependencyInfo
}
-- This is processed information about the component, in particular the dynflags will be modified.
data ComponentInfo = ComponentInfo
{ componentUnitId :: InstalledUnitId
-- | Processed DynFlags. Does not contain inplace packages such as local
-- libraries. Can be used to actually load this Component.
, componentDynFlags :: DynFlags
-- | Internal units, such as local libraries, that this component
-- is loaded with. These have been extracted from the original
-- ComponentOptions.
, componentInternalUnits :: [InstalledUnitId]
-- | All targets of this component.
, componentTargets :: [Target]
-- | Filepath which caused the creation of this component
, componentFP :: NormalizedFilePath
-- | Component Options used to load the component.
, componentCOptions :: ComponentOptions
-- | Maps cradle dependencies, such as `stack.yaml`, or `.cabal` file
-- to last modification time. See Note [Multi Cradle Dependency Info]
, componentDependencyInfo :: DependencyInfo
}
-- | Check if any dependency has been modified lately.
checkDependencyInfo :: DependencyInfo -> IO Bool
checkDependencyInfo old_di = do
di <- getDependencyInfo (Map.keys old_di)
return (di == old_di)
-- Note [Multi Cradle Dependency Info]
-- ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
-- Why do we implement our own file modification tracking here?
-- The primary reason is that the custom caching logic is quite complicated and going into shake
-- adds even more complexity and more indirection. I did try for about 5 hours to work out how to
-- use shake rules rather than IO but eventually gave up.
-- | Computes a mapping from a filepath to its latest modification date.
-- See Note [Multi Cradle Dependency Info] why we do this ourselves instead
-- of letting shake take care of it.
getDependencyInfo :: [FilePath] -> IO DependencyInfo
getDependencyInfo fs = Map.fromList <$> mapM do_one fs
where
tryIO :: IO a -> IO (Either IOException a)
tryIO = try
do_one :: FilePath -> IO (FilePath, Maybe UTCTime)
do_one fp = (fp,) . eitherToMaybe <$> tryIO (getModificationTime fp)
-- | This function removes all the -package flags which refer to packages we
-- are going to deal with ourselves. For example, if an executable depends
-- on a library component, then this function will remove the library flag
-- from the package flags for the executable
--
-- There are several places in GHC (for example the call to hptInstances in
-- tcRnImports) which assume that all modules in the HPT have the same unit
-- ID. Therefore we create a fake one and give them all the same unit id.
removeInplacePackages :: [InstalledUnitId] -> DynFlags -> (DynFlags, [InstalledUnitId])
removeInplacePackages us df = (df { packageFlags = ps
, thisInstalledUnitId = fake_uid }, uids)
where
(uids, ps) = partitionEithers (map go (packageFlags df))
fake_uid = toInstalledUnitId (stringToUnitId "fake_uid")
go p@(ExposePackage _ (UnitIdArg u) _) = if toInstalledUnitId u `elem` us
then Left (toInstalledUnitId u)
else Right p
go p = Right p
-- | Memoize an IO function, with the characteristics:
--
@ -222,3 +605,79 @@ memoIO op = do
res <- onceFork $ op k
return (Map.insert k res mp, res)
Just res -> return (mp, res)
setOptions :: GhcMonad m => ComponentOptions -> DynFlags -> m (DynFlags, [Target])
setOptions (ComponentOptions theOpts compRoot _) dflags = do
(dflags', targets) <- addCmdOpts theOpts dflags
let dflags'' =
-- disabled, generated directly by ghcide instead
flip gopt_unset Opt_WriteInterface $
-- disabled, generated directly by ghcide instead
-- also, it can confuse the interface stale check
dontWriteHieFiles $
setIgnoreInterfacePragmas $
setLinkerOptions $
disableOptimisation $
makeDynFlagsAbsolute compRoot dflags'
-- initPackages parses the -package flags and
-- sets up the visibility for each component.
(final_df, _) <- liftIO $ initPackages dflags''
return (final_df, targets)
-- we don't want to generate object code so we compile to bytecode
-- (HscInterpreted) which implies LinkInMemory
-- HscInterpreted
setLinkerOptions :: DynFlags -> DynFlags
setLinkerOptions df = df {
ghcLink = LinkInMemory
, hscTarget = HscNothing
, ghcMode = CompManager
}
setIgnoreInterfacePragmas :: DynFlags -> DynFlags
setIgnoreInterfacePragmas df =
gopt_set (gopt_set df Opt_IgnoreInterfacePragmas) Opt_IgnoreOptimChanges
disableOptimisation :: DynFlags -> DynFlags
disableOptimisation df = updOptLevel 0 df
setHiDir :: FilePath -> DynFlags -> DynFlags
setHiDir f d =
-- override user settings to avoid conflicts leading to recompilation
d { hiDir = Just f}
getCacheDir :: String -> [String] -> IO FilePath
getCacheDir prefix opts = IO.getXdgDirectory IO.XdgCache (cacheDir </> prefix ++ "-" ++ opts_hash)
where
-- Create a unique folder per set of different GHC options, assuming that each different set of
-- GHC options will create incompatible interface files.
opts_hash = B.unpack $ encode $ H.finalize $ H.updates H.init (map B.pack opts)
-- | Sub directory for the cache path
cacheDir :: String
cacheDir = "ghcide"
notifyCradleLoaded :: FilePath -> FromServerMessage
notifyCradleLoaded fp =
NotCustomServer $
NotificationMessage "2.0" (CustomServerMethod cradleLoadedMethod) $
toJSON fp
cradleLoadedMethod :: T.Text
cradleLoadedMethod = "ghcide/cradle/loaded"
ghcVersionChecker :: IO VersionCheck
ghcVersionChecker = $$(makeGhcVersionChecker (pure <$> getLibdir))
checkGhcVersion :: IO (Maybe HscEnvEq)
checkGhcVersion = do
res <- ghcVersionChecker
case res of
Failure err -> do
putStrLn $ "Error while checking GHC version: " ++ show err
return Nothing
Mismatch {..} ->
return $ Just GhcVersionMismatch {..}
_ ->
return Nothing


@ -1,34 +0,0 @@
{-# LANGUAGE TypeFamilies #-}
module RuleTypes (GetHscEnv(..), LoadCradle(..)) where
import Control.DeepSeq
import Data.Binary
import Data.Hashable (Hashable)
import Development.Shake
import Development.IDE.GHC.Util
import Data.Typeable (Typeable)
import GHC.Generics (Generic)
-- Rule type for caching GHC sessions.
type instance RuleResult GetHscEnv = HscEnvEq
data GetHscEnv = GetHscEnv
{ hscenvOptions :: [String] -- componentOptions from hie-bios
, hscenvRoot :: FilePath -- componentRoot from hie-bios
, hscenvDependencies :: [FilePath] -- componentDependencies from hie-bios
}
deriving (Eq, Show, Typeable, Generic)
instance Hashable GetHscEnv
instance NFData GetHscEnv
instance Binary GetHscEnv
-- Rule type for caching cradle loading
type instance RuleResult LoadCradle = HscEnvEq
data LoadCradle = LoadCradle
deriving (Eq, Show, Typeable, Generic)
instance Hashable LoadCradle
instance NFData LoadCradle
instance Binary LoadCradle


@ -1,147 +0,0 @@
{-# LANGUAGE TemplateHaskell #-}
module Rules
( loadGhcSession
, cradleToSession
, cradleLoadedMethod
, createSession
, getComponentOptions
)
where
import Control.Exception
import Control.Monad (filterM, when)
import qualified Crypto.Hash.SHA1 as H
import Data.ByteString.Base16 (encode)
import qualified Data.ByteString.Char8 as B
import Data.Functor ((<&>))
import Data.Text (Text, pack)
import Development.IDE.Core.Rules (defineNoFile)
import Development.IDE.Core.Service (getIdeOptions)
import Development.IDE.Core.Shake (actionLogger, sendEvent, define, useNoFile_)
import Development.IDE.GHC.Util
import Development.IDE.Types.Location (fromNormalizedFilePath)
import Development.IDE.Types.Options (IdeOptions(IdeOptions, optTesting))
import Development.Shake
import GHC
import GHC.Check (VersionCheck(..), makeGhcVersionChecker)
import HIE.Bios
import HIE.Bios.Cradle
import HIE.Bios.Environment (addCmdOpts, makeDynFlagsAbsolute)
import HIE.Bios.Types
import Linker (initDynLinker)
import RuleTypes
import qualified System.Directory.Extra as IO
import System.FilePath.Posix (addTrailingPathSeparator,
(</>))
import qualified Language.Haskell.LSP.Messages as LSP
import qualified Language.Haskell.LSP.Types as LSP
import Data.Aeson (ToJSON(toJSON))
import Development.IDE.Types.Logger (logDebug)
import Util
import System.IO (hPutStrLn, stderr)
-- Prefix for the cache path
cacheDir :: String
cacheDir = "ghcide"
notifyCradleLoaded :: FilePath -> LSP.FromServerMessage
notifyCradleLoaded fp =
LSP.NotCustomServer $
LSP.NotificationMessage "2.0" (LSP.CustomServerMethod cradleLoadedMethod) $
toJSON fp
loadGhcSession :: Rules ()
loadGhcSession =
-- This rule is for caching the GHC session. E.g., even when the cabal file
-- changed, if the resulting flags did not change, we would continue to use
-- the existing session.
defineNoFile $ \(GetHscEnv opts optRoot deps) ->
liftIO $ createSession $ ComponentOptions opts optRoot deps
cradleToSession :: Rules ()
cradleToSession = define $ \LoadCradle nfp -> do
let f = fromNormalizedFilePath nfp
IdeOptions{optTesting} <- getIdeOptions
logger <- actionLogger
liftIO $ logDebug logger $ "Running cradle " <> pack (fromNormalizedFilePath nfp)
-- If the path points to a directory, load the implicit cradle
mbYaml <- doesDirectoryExist f <&> \isDir -> if isDir then Nothing else Just f
cradle <- liftIO $ maybe (loadImplicitCradle $ addTrailingPathSeparator f) loadCradle mbYaml
when optTesting $
sendEvent $ notifyCradleLoaded f
-- Avoid interrupting `getComponentOptions` since it calls external processes
cmpOpts <- liftIO $ mask $ \_ -> getComponentOptions cradle
let opts = componentOptions cmpOpts
deps = componentDependencies cmpOpts
root = componentRoot cmpOpts
deps' = case mbYaml of
-- For direct cradles, the hie.yaml file itself must be watched.
Just yaml | isDirectCradle cradle -> yaml : deps
_ -> deps
existingDeps <- filterM doesFileExist deps'
need existingDeps
([],) . pure <$> useNoFile_ (GetHscEnv opts root deps)
cradleLoadedMethod :: Text
cradleLoadedMethod = "ghcide/cradle/loaded"
getComponentOptions :: Cradle a -> IO ComponentOptions
getComponentOptions cradle = do
let showLine s = putStrLn ("> " ++ s)
-- WARNING 'runCradle is very expensive and must be called as few times as possible
cradleRes <- runCradle (cradleOptsProg cradle) showLine ""
case cradleRes of
CradleSuccess r -> pure r
CradleFail err -> throwIO err
-- TODO Rather than failing here, we should ignore any files that use this cradle.
-- That will require some more changes.
CradleNone -> fail "'none' cradle is not yet supported"
ghcVersionChecker :: IO VersionCheck
ghcVersionChecker = $$(makeGhcVersionChecker (pure <$> getLibdir))
checkGhcVersion :: IO (Maybe HscEnvEq)
checkGhcVersion = do
res <- ghcVersionChecker
case res of
Failure err -> do
putStrLn $ "Error while checking GHC version: " ++ show err
return Nothing
Mismatch {..} ->
return $ Just GhcVersionMismatch {..}
_ ->
return Nothing
createSession :: ComponentOptions -> IO HscEnvEq
createSession (ComponentOptions theOpts compRoot _) = do
libdir <- getLibdir
cacheDir <- getCacheDir theOpts
hPutStrLn stderr $ "Interface files cache dir: " <> cacheDir
runGhc (Just libdir) $ do
dflags <- getSessionDynFlags
(dflags_, _targets) <- addCmdOpts theOpts dflags
let dflags' = makeDynFlagsAbsolute compRoot dflags_
setupDynFlags cacheDir dflags'
versionMismatch <- liftIO checkGhcVersion
case versionMismatch of
Just mismatch -> return mismatch
Nothing -> do
env <- getSession
liftIO $ initDynLinker env
liftIO $ newHscEnvEq env
getCacheDir :: [String] -> IO FilePath
getCacheDir opts = IO.getXdgDirectory IO.XdgCache (cacheDir </> opts_hash)
where
-- Create a unique folder per set of different GHC options, assuming that each different set of
-- GHC options will create incompatible interface files.
opts_hash = B.unpack $ encode $ H.finalize $ H.updates H.init (map B.pack opts)


@ -1,62 +0,0 @@
module Util (setupDynFlags, getLibdir) where
-- Set the GHC libdir to the nix libdir if it's present.
import qualified GHC.Paths as GHCPaths
import DynFlags ( gopt_unset
, GhcMode(CompManager)
, HscTarget(HscNothing)
, GhcLink(LinkInMemory)
, GeneralFlag
( Opt_IgnoreInterfacePragmas
, Opt_IgnoreOptimChanges
, Opt_WriteInterface
)
, gopt_set
, updOptLevel
, DynFlags(..)
)
import Data.Maybe ( fromMaybe )
import Development.IDE.GHC.Util ( setDefaultHieDir
, dontWriteHieFiles
)
import System.Environment ( lookupEnv )
import GHC (GhcMonad, setSessionDynFlags )
import Data.Functor ( void )
setupDynFlags :: GhcMonad f => FilePath -> DynFlags -> f ()
setupDynFlags cacheDir =
void
. setSessionDynFlags
-- disabled, generated directly by ghcide instead
. flip gopt_unset Opt_WriteInterface
-- disabled, generated directly by ghcide instead
-- also, it can confuse the interface stale check
. dontWriteHieFiles
. setHiDir cacheDir
. setDefaultHieDir cacheDir
. setIgnoreInterfacePragmas
. setLinkerOptions
. disableOptimisation
getLibdir :: IO FilePath
getLibdir = fromMaybe GHCPaths.libdir <$> lookupEnv "NIX_GHC_LIBDIR"
-- we don't want to generate object code so we compile to bytecode
-- (HscInterpreted) which implies LinkInMemory
-- HscInterpreted
setLinkerOptions :: DynFlags -> DynFlags
setLinkerOptions df =
df { ghcLink = LinkInMemory, hscTarget = HscNothing, ghcMode = CompManager }
setIgnoreInterfacePragmas :: DynFlags -> DynFlags
setIgnoreInterfacePragmas df =
gopt_set (gopt_set df Opt_IgnoreInterfacePragmas) Opt_IgnoreOptimChanges
disableOptimisation :: DynFlags -> DynFlags
disableOptimisation df = updOptLevel 0 df
setHiDir :: FilePath -> DynFlags -> DynFlags
setHiDir f d =
-- override user settings to avoid conflicts leading to recompilation
d { hiDir = Just f }

exe/Utils.hs (new file)

@ -0,0 +1,9 @@
module Utils (getLibdir) where
import qualified GHC.Paths
import System.Environment
import Data.Maybe
-- Set the GHC libdir to the nix libdir if it's present.
getLibdir :: IO FilePath
getLibdir = fromMaybe GHC.Paths.libdir <$> lookupEnv "NIX_GHC_LIBDIR"


@ -15,7 +15,13 @@ homepage: https://github.com/digital-asset/ghcide#readme
bug-reports: https://github.com/digital-asset/ghcide/issues
tested-with: GHC==8.6.5
extra-source-files: include/ghc-api-version.h README.md CHANGELOG.md
test/data/GotoHover.hs
test/data/hover/*.hs
test/data/multi/cabal.project
test/data/multi/hie.yaml
test/data/multi/a/a.cabal
test/data/multi/a/*.hs
test/data/multi/b/b.cabal
test/data/multi/b/*.hs
source-repository head
type: git
@ -184,6 +190,8 @@ executable ghcide
"-with-rtsopts=-I0 -qg -A128M"
main-is: Main.hs
build-depends:
time,
async,
hslogger,
aeson,
base == 4.*,
@ -211,11 +219,9 @@ executable ghcide
text,
unordered-containers
other-modules:
Utils
Arguments
Paths_ghcide
Rules
RuleTypes
Util
default-extensions:
BangPatterns


@ -1 +1,10 @@
cradle: {stack: {component: "ghcide:lib"}}
cradle:
cabal:
- path: "./src"
component: "ghcide:lib:ghcide"
- path: "./exe"
component: "ghcide:exe:ghcide"
- path: "./test"
component: "ghcide:test:ghcide-tests"
- path: "./test/preprocessor"
component: "ghcide:exe:ghcide-test-preprocessor"


@ -90,14 +90,15 @@ import Exception (ExceptionMonad)
parseModule
:: IdeOptions
-> HscEnv
-> [PackageName]
-> FilePath
-> Maybe SB.StringBuffer
-> IO (IdeResult (StringBuffer, ParsedModule))
parseModule IdeOptions{..} env filename mbContents =
parseModule IdeOptions{..} env comp_pkgs filename mbContents =
fmap (either (, Nothing) id) $
evalGhcEnv env $ runExceptT $ do
(contents, dflags) <- preprocessor filename mbContents
(diag, modu) <- parseFileContents optPreprocessor dflags filename contents
(diag, modu) <- parseFileContents optPreprocessor dflags comp_pkgs filename contents
return (diag, Just (contents, modu))
@ -499,10 +500,11 @@ parseFileContents
:: GhcMonad m
=> (GHC.ParsedSource -> IdePreprocessedSource)
-> DynFlags -- ^ flags to use
-> [PackageName] -- ^ The package imports to ignore
-> FilePath -- ^ the filename (for source locations)
-> SB.StringBuffer -- ^ Haskell module source text (full Unicode is supported)
-> ExceptT [FileDiagnostic] m ([FileDiagnostic], ParsedModule)
parseFileContents customPreprocessor dflags filename contents = do
parseFileContents customPreprocessor dflags comp_pkgs filename contents = do
let loc = mkRealSrcLoc (mkFastString filename) 1 1
case unP Parser.parseModule (mkPState dflags contents loc) of
#if MIN_GHC_API_VERSION(8,10,0)
@ -534,18 +536,34 @@ parseFileContents customPreprocessor dflags filename contents = do
-- Ok, we got here. It's safe to continue.
let IdePreprocessedSource preproc_warns errs parsed = customPreprocessor rdr_module
unless (null errs) $ throwE $ diagFromStrings "parser" DsError errs
let parsed' = removePackageImports comp_pkgs parsed
let preproc_warnings = diagFromStrings "parser" DsWarning preproc_warns
ms <- getModSummaryFromBuffer filename dflags parsed
ms <- getModSummaryFromBuffer filename dflags parsed'
let pm =
ParsedModule {
pm_mod_summary = ms
, pm_parsed_source = parsed
, pm_parsed_source = parsed'
, pm_extra_src_files=[] -- src imports not allowed
, pm_annotations = hpm_annotations
}
warnings = diagFromErrMsgs "parser" dflags warns
pure (warnings ++ preproc_warnings, pm)
-- | After parsing the module remove all package imports referring to
-- these packages as we have already dealt with what they map to.
removePackageImports :: [PackageName] -> GHC.ParsedSource -> GHC.ParsedSource
removePackageImports pkgs (L l h@HsModule {hsmodImports} ) = L l (h { hsmodImports = imports' })
where
imports' = map do_one_import hsmodImports
do_one_import (L l i@ImportDecl{ideclPkgQual}) =
case PackageName . sl_fs <$> ideclPkgQual of
Just pn | pn `elem` pkgs -> L l (i { ideclPkgQual = Nothing })
_ -> L l i
#if MIN_GHC_API_VERSION(8,6,0)
do_one_import l = l
#endif
loadHieFile :: FilePath -> IO GHC.HieFile
loadHieFile f = do
u <- mkSplitUniqSupply 'a'


@ -59,9 +59,11 @@ import Development.IDE.GHC.Error
import Development.Shake hiding (Diagnostic)
import Development.IDE.Core.RuleTypes
import Development.IDE.Spans.Type
import qualified Data.ByteString.Char8 as BS
import qualified GHC.LanguageExtensions as LangExt
import HscTypes
import PackageConfig
import DynFlags (gopt_set, xopt)
import GHC.Generics(Generic)
@ -141,7 +143,6 @@ getHomeHieFile f = do
hie_f = ml_hie_file $ ms_location ms
mbHieTimestamp <- use GetModificationTime normal_hie_f
srcTimestamp <- use_ GetModificationTime f
let isUpToDate
| Just d <- mbHieTimestamp = comparing modificationTime d srcTimestamp == GT
| otherwise = False
@ -191,12 +192,16 @@ priorityFilesOfInterest = Priority (-2)
getParsedModuleRule :: Rules ()
getParsedModuleRule = defineEarlyCutoff $ \GetParsedModule file -> do
hsc <- hscEnv <$> use_ GhcSession file
sess <- use_ GhcSession file
let hsc = hscEnv sess
-- These packages are used when removing PackageImports from a
-- parsed module
comp_pkgs = mapMaybe (fmap fst . mkImportDirs (hsc_dflags hsc)) (deps sess)
opt <- getIdeOptions
(_, contents) <- getFileContents file
let dflags = hsc_dflags hsc
mainParse = getParsedModuleDefinition hsc opt file contents
mainParse = getParsedModuleDefinition hsc opt comp_pkgs file contents
-- Parse again (if necessary) to capture Haddock parse errors
if gopt Opt_Haddock dflags
@ -206,7 +211,7 @@ getParsedModuleRule = defineEarlyCutoff $ \GetParsedModule file -> do
let hscHaddock = hsc{hsc_dflags = gopt_set dflags Opt_Haddock}
haddockParse = do
(_, (!diagsHaddock, _)) <-
getParsedModuleDefinition hscHaddock opt file contents
getParsedModuleDefinition hscHaddock opt comp_pkgs file contents
return diagsHaddock
((fingerPrint, (diags, res)), diagsHaddock) <-
@ -217,9 +222,9 @@ getParsedModuleRule = defineEarlyCutoff $ \GetParsedModule file -> do
return (fingerPrint, (mergeDiagnostics diags diagsHaddock, res))
getParsedModuleDefinition :: HscEnv -> IdeOptions -> NormalizedFilePath -> Maybe T.Text -> IO (Maybe ByteString, ([FileDiagnostic], Maybe ParsedModule))
getParsedModuleDefinition packageState opt file contents = do
(diag, res) <- parseModule opt packageState (fromNormalizedFilePath file) (fmap textToStringBuffer contents)
getParsedModuleDefinition :: HscEnv -> IdeOptions -> [PackageName] -> NormalizedFilePath -> Maybe T.Text -> IO (Maybe ByteString, ([FileDiagnostic], Maybe ParsedModule))
getParsedModuleDefinition packageState opt comp_pkgs file contents = do
(diag, res) <- parseModule opt packageState comp_pkgs (fromNormalizedFilePath file) (fmap textToStringBuffer contents)
case res of
Nothing -> pure (Nothing, (diag, Nothing))
Just (contents, modu) -> do
@ -233,11 +238,13 @@ getLocatedImportsRule =
define $ \GetLocatedImports file -> do
ms <- use_ GetModSummary file
let imports = [(False, imp) | imp <- ms_textual_imps ms] ++ [(True, imp) | imp <- ms_srcimps ms]
env <- hscEnv <$> use_ GhcSession file
env_eq <- use_ GhcSession file
let env = hscEnv env_eq
let import_dirs = deps env_eq
let dflags = addRelativeImport file (moduleName $ ms_mod ms) $ hsc_dflags env
opt <- getIdeOptions
(diags, imports') <- fmap unzip $ forM imports $ \(isSource, (mbPkgName, modName)) -> do
diagOrImp <- locateModule dflags (optExtensions opt) getFileExists modName mbPkgName isSource
diagOrImp <- locateModule dflags import_dirs (optExtensions opt) getFileExists modName mbPkgName isSource
case diagOrImp of
Left diags -> pure (diags, Left (modName, Nothing))
Right (FileImport path) -> pure ([], Left (modName, Just path))
@ -522,7 +529,7 @@ instance Hashable GhcSessionIO
instance NFData GhcSessionIO
instance Binary GhcSessionIO
newtype GhcSessionFun = GhcSessionFun (FilePath -> Action HscEnvEq)
newtype GhcSessionFun = GhcSessionFun (FilePath -> Action (IdeResult HscEnvEq))
instance Show GhcSessionFun where show _ = "GhcSessionFun"
instance NFData GhcSessionFun where rnf !_ = ()
@ -532,11 +539,26 @@ loadGhcSession = do
defineNoFile $ \GhcSessionIO -> do
opts <- getIdeOptions
GhcSessionFun <$> optGhcSession opts
-- This function should always be rerun because it consults a cache to
-- see what HscEnv needs to be used for the file, which can change.
-- However, it should also cut-off early if it's the same HscEnv as
-- last time
defineEarlyCutoff $ \GhcSession file -> do
GhcSessionFun fun <- useNoFile_ GhcSessionIO
alwaysRerun
val <- fun $ fromNormalizedFilePath file
-- TODO: What was this doing before?
opts <- getIdeOptions
return ("" <$ optShakeFiles opts, ([], Just val))
let cutoffHash =
case optShakeFiles opts of
-- optShakeFiles is only set in the DAML case.
-- https://github.com/digital-asset/ghcide/pull/522#discussion_r428622915
Just {} -> ""
-- Hash the HscEnvEq returned so cutoff if it didn't change
-- from last time
Nothing -> BS.pack (show (hash (snd val)))
return (Just cutoffHash, val)
getHiFileRule :: Rules ()
getHiFileRule = defineEarlyCutoff $ \GetHiFile f -> do
@ -601,12 +623,16 @@ getModIfaceRule = define $ \GetModIface f -> do
-- the interface file does not exist or is out of date.
-- Invoke typechecking directly to update it without incurring a dependency
-- on the parsed module and the typecheck rules
hsc <- hscEnv <$> use_ GhcSession f
sess <- use_ GhcSession f
let hsc = hscEnv sess
-- After parsing the module remove all package imports referring to
-- these packages as we have already dealt with what they map to.
comp_pkgs = mapMaybe (fmap fst . mkImportDirs (hsc_dflags hsc)) (deps sess)
opt <- getIdeOptions
(_, contents) <- getFileContents f
-- Embed --haddocks in the interface file
hsc <- pure hsc{hsc_dflags = gopt_set (hsc_dflags hsc) Opt_Haddock}
(_, (diags, mb_pm)) <- liftIO $ getParsedModuleDefinition hsc opt f contents
(_, (diags, mb_pm)) <- liftIO $ getParsedModuleDefinition hsc opt comp_pkgs f contents
case mb_pm of
Nothing -> return (diags, Nothing)
Just pm -> do


@ -171,9 +171,6 @@ instance Hashable Key where
-- get empty diagnostics and a Nothing, to indicate this phase throws no fresh
-- errors but still failed.
--
-- A rule on a file should only return diagnostics for that given file. It should
-- not propagate diagnostic errors through multiple phases.
type IdeResult v = ([FileDiagnostic], Maybe v)
data Value v
= Succeeded TextDocumentVersion v


@ -26,6 +26,7 @@ module Development.IDE.GHC.Compat(
includePathsQuote,
addIncludePathsQuote,
getModuleHash,
getPackageName,
pattern DerivD,
pattern ForD,
pattern InstD,
@ -47,6 +48,7 @@ import DynFlags
import FieldLabel
import Fingerprint (Fingerprint)
import qualified Module
import Packages
import qualified GHC
import GHC hiding (
@ -302,3 +304,6 @@ getConArgs = GHC.getConArgs
#else
getConArgs = GHC.getConDetails
#endif
getPackageName :: DynFlags -> Module.InstalledUnitId -> Maybe PackageName
getPackageName dfs i = packageName <$> lookupPackage dfs (Module.DefiniteUnitId (Module.DefUnitId i))


@ -8,6 +8,7 @@ module Development.IDE.GHC.Util(
modifyDynFlags,
evalGhcEnv,
runGhcEnv,
deps,
-- * GHC wrappers
prettyPrint,
printRdrName,
@ -64,7 +65,7 @@ import Packages (getPackageConfigMap, lookupPackage')
import SrcLoc (mkRealSrcLoc)
import FastString (mkFastString)
import DynFlags (emptyFilesToClean, unsafeGlobalDynFlags)
import Module (moduleNameSlashes)
import Module (moduleNameSlashes, InstalledUnitId)
import OccName (parenSymOcc)
import RdrName (nameRdrName, rdrNameOcc)
@ -166,6 +167,9 @@ moduleImportPath (takeDirectory . fromNormalizedFilePath -> pathDir) mn
-- if they are created with the same call to 'newHscEnvEq'.
data HscEnvEq
= HscEnvEq !Unique !HscEnv
[(InstalledUnitId, DynFlags)] -- In memory components for this HscEnv
-- This is only used at the moment for the import dirs in
-- the DynFlags
| GhcVersionMismatch { compileTime :: !Version
, runTime :: !Version
}
@ -175,7 +179,7 @@ hscEnv :: HscEnvEq -> HscEnv
hscEnv = either error id . hscEnv'
hscEnv' :: HscEnvEq -> Either String HscEnv
hscEnv' (HscEnvEq _ x) = Right x
hscEnv' (HscEnvEq _ x _) = Right x
hscEnv' GhcVersionMismatch{..} = Left $
unwords
["ghcide compiled against GHC"
@ -185,25 +189,29 @@ hscEnv' GhcVersionMismatch{..} = Left $
,". This is unsupported, ghcide must be compiled with the same GHC version as the project."
]
deps :: HscEnvEq -> [(InstalledUnitId, DynFlags)]
deps (HscEnvEq _ _ u) = u
deps GhcVersionMismatch{} = []
-- | Wrap an 'HscEnv' into an 'HscEnvEq'.
newHscEnvEq :: HscEnv -> IO HscEnvEq
newHscEnvEq e = do u <- newUnique; return $ HscEnvEq u e
newHscEnvEq :: HscEnv -> [(InstalledUnitId, DynFlags)] -> IO HscEnvEq
newHscEnvEq e uids = do u <- newUnique; return $ HscEnvEq u e uids
instance Show HscEnvEq where
show (HscEnvEq a _) = "HscEnvEq " ++ show (hashUnique a)
show (HscEnvEq a _ _) = "HscEnvEq " ++ show (hashUnique a)
show GhcVersionMismatch{..} = "GhcVersionMismatch " <> show (compileTime, runTime)
instance Eq HscEnvEq where
HscEnvEq a _ == HscEnvEq b _ = a == b
HscEnvEq a _ _ == HscEnvEq b _ _ = a == b
GhcVersionMismatch a b == GhcVersionMismatch c d = a == c && b == d
_ == _ = False
instance NFData HscEnvEq where
rnf (HscEnvEq a b) = rnf (hashUnique a) `seq` b `seq` ()
rnf (HscEnvEq a b c) = rnf (hashUnique a) `seq` b `seq` c `seq` ()
rnf GhcVersionMismatch{} = rnf runTime
instance Hashable HscEnvEq where
hashWithSalt salt (HscEnvEq u _) = hashWithSalt salt u
hashWithSalt s (HscEnvEq a _b _c) = hashWithSalt s a
hashWithSalt salt GhcVersionMismatch{..} = hashWithSalt salt (compileTime, runTime)
-- Fake instance needed to persuade Shake to accept this type as a key.


@ -62,7 +62,7 @@ data ModuleImports = ModuleImports
-- that module on disk (if we found it)
, packageImports :: !(Set InstalledUnitId)
-- ^ Transitive package dependencies unioned for all imports.
}
} deriving Show
-- | For processing dependency information, we need lots of maps and sets of
-- filepaths. Comparing Strings is really slow, so we work with IntMap/IntSet
@ -128,7 +128,7 @@ data RawDependencyInformation = RawDependencyInformation
-- need to add edges between .hs-boot and .hs so that the .hs files
-- appear later in the sort.
, rawBootMap :: !BootIdMap
}
} deriving Show
pkgDependencies :: RawDependencyInformation -> FilePathIdMap (Set InstalledUnitId)
pkgDependencies RawDependencyInformation{..} =


@ -10,16 +10,16 @@ module Development.IDE.Import.FindImports
, ArtifactsLocation(..)
, modSummaryToArtifactsLocation
, isBootLocation
, mkImportDirs
) where
import Development.IDE.GHC.Error as ErrUtils
import Development.IDE.GHC.Orphans()
import Development.IDE.Types.Diagnostics
import Development.IDE.Types.Location
import Development.IDE.GHC.Compat
-- GHC imports
import DynFlags
import FastString
import GHC
import qualified Module as M
import Packages
import Outputable (showSDoc, ppr, pprPanic)
@ -31,6 +31,7 @@ import Control.Monad.Extra
import Control.Monad.IO.Class
import System.FilePath
import DriverPhases
import Data.Maybe
data Import
= FileImport !ArtifactsLocation
@ -63,55 +64,72 @@ modSummaryToArtifactsLocation nfp ms = ArtifactsLocation nfp (ms_location ms) (i
-- | locate a module in the file system. Where we go from *daml to Haskell
locateModuleFile :: MonadIO m
=> DynFlags
=> [[FilePath]]
-> [String]
-> (NormalizedFilePath -> m Bool)
-> Bool
-> ModuleName
-> m (Maybe NormalizedFilePath)
locateModuleFile dflags exts doesExist isSource modName = do
let candidates =
locateModuleFile import_dirss exts doesExist isSource modName = do
let candidates import_dirs =
[ toNormalizedFilePath' (prefix </> M.moduleNameSlashes modName <.> maybeBoot ext)
| prefix <- importPaths dflags, ext <- exts]
findM doesExist candidates
| prefix <- import_dirs , ext <- exts]
findM doesExist (concatMap candidates import_dirss)
where
maybeBoot ext
| isSource = ext ++ "-boot"
| otherwise = ext
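-- (Illustration, not part of the patch.) Each inner list holds one component's import
-- directories, so the caller controls the search order; 'locateModule' below calls
--   locateModuleFile (importPaths dflags : map snd import_paths) exts doesExist isSource m
-- to search the current component's directories before those of the other loaded components.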
-- | Compute the package name and import paths of a loaded component, i.e. the
-- directories in which its modules can be found. It only returns Just for unit ids
-- which can be imported into the current module; in particular, it returns Nothing
-- for 'main' components, as they can never be imported into another package.
mkImportDirs :: DynFlags -> (M.InstalledUnitId, DynFlags) -> Maybe (PackageName, [FilePath])
mkImportDirs df (i, DynFlags{importPaths}) = (, importPaths) <$> getPackageName df i
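-- (Illustration, not part of the patch.) 'locateModule' below consumes this as
--   import_paths = mapMaybe (mkImportDirs dflags) comp_info
-- so for the two-component test project added further down the lookup table would be,
-- roughly, [(PackageName "a", <import dirs of a>), (PackageName "b", <import dirs of b>)],
-- letting a package-qualified import resolve to the right component's source directories.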
-- | locate a module in either the file system or the package database. Where we go from *daml to
-- Haskell
locateModule
:: MonadIO m
=> DynFlags
-> [(M.InstalledUnitId, DynFlags)] -- Sets import directories to look in
-> [String]
-> (NormalizedFilePath -> m Bool)
-> Located ModuleName
-> Maybe FastString
-> Bool
-> m (Either [FileDiagnostic] Import)
locateModule dflags exts doesExist modName mbPkgName isSource = do
locateModule dflags comp_info exts doesExist modName mbPkgName isSource = do
case mbPkgName of
-- "this" means that we should only look in the current package
Just "this" -> do
mbFile <- locateModuleFile dflags exts doesExist isSource $ unLoc modName
case mbFile of
Nothing -> return $ Left $ notFoundErr dflags modName $ LookupNotFound []
Just file -> toModLocation file
lookupLocal [importPaths dflags]
-- if a package name is given we only go look for a package
Just _pkgName -> lookupInPackageDB dflags
Just pkgName
| Just dirs <- lookup (PackageName pkgName) import_paths
-> lookupLocal [dirs]
| otherwise -> lookupInPackageDB dflags
Nothing -> do
-- first try to find the module as a file. If we can't find it try to find it in the package
-- database.
mbFile <- locateModuleFile dflags exts doesExist isSource $ unLoc modName
-- Here the importPaths for the current module are added to the front of the import paths from the other components.
-- This is particularly important for Paths_* modules, which are generated for every component: unless it is used in
-- each component, the module will end up being found in the wrong place and cause a multi-cradle match failure.
mbFile <- locateModuleFile (importPaths dflags : map snd import_paths) exts doesExist isSource $ unLoc modName
case mbFile of
Nothing -> lookupInPackageDB dflags
Just file -> toModLocation file
where
import_paths = mapMaybe (mkImportDirs dflags) comp_info
toModLocation file = liftIO $ do
loc <- mkHomeModLocation dflags (unLoc modName) (fromNormalizedFilePath file)
return $ Right $ FileImport $ ArtifactsLocation file loc (not isSource)
lookupLocal dirs = do
mbFile <- locateModuleFile dirs exts doesExist isSource $ unLoc modName
case mbFile of
Nothing -> return $ Left $ notFoundErr dflags modName $ LookupNotFound []
Just file -> toModLocation file
lookupInPackageDB dfs =
case lookupModuleWithSuggestions dfs (unLoc modName) mbPkgName of


@ -6,6 +6,7 @@ module Development.IDE.Types.Diagnostics (
LSP.Diagnostic(..),
ShowDiagnostic(..),
FileDiagnostic,
IdeResult,
LSP.DiagnosticSeverity(..),
DiagnosticStore,
List(..),
@ -31,6 +32,9 @@ import Data.Text.Prettyprint.Doc.Render.Terminal (Color(..), color)
import Development.IDE.Types.Location
-- A rule on a file should only return diagnostics for that given file. It should
-- not propagate diagnostic errors through multiple phases.
type IdeResult v = ([FileDiagnostic], Maybe v)
ideErrorText :: NormalizedFilePath -> T.Text -> FileDiagnostic
ideErrorText = ideErrorWithSource (Just "compiler") (Just DsError)
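To make the new alias concrete, a hedged sketch of the two shapes an IdeResult usually takes; ok and failed are made-up names:
ok :: v -> IdeResult v
ok v = ([], Just v)              -- success: no diagnostics, a value

failed :: [FileDiagnostic] -> IdeResult v
failed diags = (diags, Nothing)  -- failure: diagnostics for this file, no value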


@ -12,6 +12,7 @@ module Development.IDE.Types.Options
, clientSupportsProgress
, IdePkgLocationOptions(..)
, defaultIdeOptions
, IdeResult
) where
import Development.Shake
@ -20,12 +21,13 @@ import GHC hiding (parseModule, typecheckModule)
import GhcPlugins as GHC hiding (fst3, (<>))
import qualified Language.Haskell.LSP.Types.Capabilities as LSP
import qualified Data.Text as T
import Development.IDE.Types.Diagnostics
data IdeOptions = IdeOptions
{ optPreprocessor :: GHC.ParsedSource -> IdePreprocessedSource
-- ^ Preprocessor to run over all parsed source trees, generating a list of warnings
-- and a list of errors, along with a new parse tree.
, optGhcSession :: Action (FilePath -> Action HscEnvEq)
, optGhcSession :: Action (FilePath -> Action (IdeResult HscEnvEq))
-- ^ Setup a GHC session for a given file, e.g. @Foo.hs@.
-- For the same 'ComponentOptions' from hie-bios, the resulting function will be applied once per file.
-- It is desirable that many files get the same 'HscEnvEq', so that more IDE features work.
@ -76,7 +78,7 @@ clientSupportsProgress :: LSP.ClientCapabilities -> IdeReportProgress
clientSupportsProgress caps = IdeReportProgress $ Just True ==
(LSP._workDoneProgress =<< LSP._window (caps :: LSP.ClientCapabilities))
defaultIdeOptions :: Action (FilePath -> Action HscEnvEq) -> IdeOptions
defaultIdeOptions :: Action (FilePath -> Action (IdeResult HscEnvEq)) -> IdeOptions
defaultIdeOptions session = IdeOptions
{optPreprocessor = IdePreprocessedSource [] []
,optGhcSession = session
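A hedged sketch of a session action matching the new shape of optGhcSession; loadSessionFor is a hypothetical helper standing in for whatever consults the cradle:
exampleSession :: Action (FilePath -> Action (IdeResult HscEnvEq))
exampleSession = pure $ \file -> do
  res <- loadSessionFor file              -- hypothetical: Either [FileDiagnostic] HscEnvEq
  pure $ case res of
    Right envEq -> ([], Just envEq)       -- component loaded: no diagnostics
    Left diags  -> (diags, Nothing)       -- cradle failure: surfaced as diagnostics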

test/data/multi/a/A.hs (new file)

@ -0,0 +1,3 @@
module A(foo) where
foo = ()


@ -0,0 +1,9 @@
name: a
version: 1.0.0
build-type: Simple
cabal-version: >= 1.2
library
build-depends: base
exposed-modules: A
hs-source-dirs: .

test/data/multi/b/B.hs (new file)

@ -0,0 +1,3 @@
module B(module B) where
import A
qux = foo


@ -0,0 +1,9 @@
name: b
version: 1.0.0
build-type: Simple
cabal-version: >= 1.2
library
build-depends: base, a
exposed-modules: B
hs-source-dirs: .


@ -0,0 +1 @@
packages: a b

test/data/multi/hie.yaml (new file)

@ -0,0 +1,6 @@
cradle:
cabal:
- path: "./a"
component: "lib:a"
- path: "./b"
component: "lib:b"


@ -10,7 +10,7 @@
module Main (main) where
import Control.Applicative.Combinators
import Control.Exception (catch)
import Control.Exception (bracket, catch)
import qualified Control.Lens as Lens
import Control.Monad
import Control.Monad.IO.Class (liftIO)
@ -35,7 +35,7 @@ import Language.Haskell.LSP.Types.Capabilities
import qualified Language.Haskell.LSP.Types.Lens as Lsp (diagnostics, params, message)
import Language.Haskell.LSP.VFS (applyChange)
import Network.URI
import System.Environment.Blank (setEnv)
import System.Environment.Blank (getEnv, setEnv, unsetEnv)
import System.FilePath
import System.IO.Extra
import System.Directory
@ -49,32 +49,35 @@ import Test.Tasty.QuickCheck
import Data.Maybe
main :: IO ()
main = defaultMainWithRerun $ testGroup "HIE"
[ testSession "open close" $ do
doc <- createDoc "Testing.hs" "haskell" ""
void (skipManyTill anyMessage message :: Session WorkDoneProgressCreateRequest)
void (skipManyTill anyMessage message :: Session WorkDoneProgressBeginNotification)
closeDoc doc
void (skipManyTill anyMessage message :: Session WorkDoneProgressEndNotification)
, initializeResponseTests
, completionTests
, cppTests
, diagnosticTests
, codeActionTests
, codeLensesTests
, outlineTests
, findDefinitionAndHoverTests
, pluginTests
, preprocessorTests
, thTests
, safeTests
, unitTests
, haddockTests
, positionMappingTests
, watchedFilesTests
, cradleTests
, dependentFileTest
]
main = do
-- We mess with env vars so run single-threaded.
setEnv "TASTY_NUM_THREADS" "1" True
defaultMainWithRerun $ testGroup "HIE"
[ testSession "open close" $ do
doc <- createDoc "Testing.hs" "haskell" ""
void (skipManyTill anyMessage message :: Session WorkDoneProgressCreateRequest)
void (skipManyTill anyMessage message :: Session WorkDoneProgressBeginNotification)
closeDoc doc
void (skipManyTill anyMessage message :: Session WorkDoneProgressEndNotification)
, initializeResponseTests
, completionTests
, cppTests
, diagnosticTests
, codeActionTests
, codeLensesTests
, outlineTests
, findDefinitionAndHoverTests
, pluginTests
, preprocessorTests
, thTests
, safeTests
, unitTests
, haddockTests
, positionMappingTests
, watchedFilesTests
, cradleTests
, dependentFileTest
]
initializeResponseTests :: TestTree
initializeResponseTests = withResource acquire release tests where
@ -1293,27 +1296,19 @@ addSigLensesTests = let
]
]
findDefinitionAndHoverTests :: TestTree
findDefinitionAndHoverTests = let
checkDefs :: [Location] -> Session [Expect] -> Session ()
checkDefs defs mkExpectations = traverse_ check =<< mkExpectations where
tst (get, check) pos targetRange title = testSession title $ do
doc <- openTestDataDoc sourceFilePath
found <- get doc pos
check found targetRange
checkDefs :: [Location] -> Session [Expect] -> Session ()
checkDefs defs mkExpectations = traverse_ check =<< mkExpectations where
check (ExpectRange expectedRange) = do
assertNDefinitionsFound 1 defs
assertRangeCorrect (head defs) expectedRange
check (ExpectLocation expectedLocation) = do
assertNDefinitionsFound 1 defs
liftIO $ head defs @?= expectedLocation
check ExpectNoDefinitions = do
assertNDefinitionsFound 0 defs
check ExpectExternFail = liftIO $ assertFailure "Expecting to fail to find in external file"
check _ = pure () -- all other expectations not relevant to getDefinition
check (ExpectRange expectedRange) = do
assertNDefinitionsFound 1 defs
assertRangeCorrect (head defs) expectedRange
check (ExpectLocation expectedLocation) = do
assertNDefinitionsFound 1 defs
liftIO $ head defs @?= expectedLocation
check ExpectNoDefinitions = do
assertNDefinitionsFound 0 defs
check ExpectExternFail = liftIO $ assertFailure "Expecting to fail to find in external file"
check _ = pure () -- all other expectations not relevant to getDefinition
assertNDefinitionsFound :: Int -> [a] -> Session ()
assertNDefinitionsFound n defs = liftIO $ assertEqual "number of definitions" n (length defs)
@ -1321,6 +1316,17 @@ findDefinitionAndHoverTests = let
assertRangeCorrect Location{_range = foundRange} expectedRange =
liftIO $ expectedRange @=? foundRange
findDefinitionAndHoverTests :: TestTree
findDefinitionAndHoverTests = let
tst (get, check) pos targetRange title = testSessionWithExtraFiles "hover" title $ \dir -> do
doc <- openTestDataDoc (dir </> sourceFilePath)
found <- get doc pos
check found targetRange
checkHover :: Maybe Hover -> Session [Expect] -> Session ()
checkHover hover expectations = traverse_ check =<< expectations where
@ -1463,8 +1469,10 @@ findDefinitionAndHoverTests = let
checkFileCompiles :: FilePath -> TestTree
checkFileCompiles fp =
testSessionWait ("Does " ++ fp ++ " compile") $
void (openTestDataDoc fp)
testSessionWithExtraFiles "hover" ("Does " ++ fp ++ " compile") $ \dir -> do
void (openTestDataDoc (dir </> fp))
expectNoMoreDiagnostics 0.5
pluginTests :: TestTree
@ -2025,6 +2033,7 @@ cradleTests :: TestTree
cradleTests = testGroup "cradle"
[testGroup "dependencies" [sessionDepsArePickedUp]
,testGroup "loading" [loadCradleOnlyonce]
,testGroup "multi" [simpleMultiTest, simpleMultiTest2]
]
loadCradleOnlyonce :: TestTree
@ -2094,6 +2103,56 @@ cradleLoadedMessage = satisfy $ \case
cradleLoadedMethod :: T.Text
cradleLoadedMethod = "ghcide/cradle/loaded"
-- Stack sets environment variables which trip up cabal in the multi-component tests.
-- However, our plugin tests rely on those env vars, so we only unset them locally.
withoutStackEnv :: IO a -> IO a
withoutStackEnv s =
bracket
(mapM getEnv vars >>= \prevState -> mapM_ unsetEnv vars >> pure prevState)
(\prevState -> mapM_ (\(var, value) -> restore var value) (zip vars prevState))
(const s)
where vars =
[ "GHC_PACKAGE_PATH"
, "GHC_ENVIRONMENT"
, "HASKELL_DIST_DIR"
, "HASKELL_PACKAGE_SANDBOX"
, "HASKELL_PACKAGE_SANDBOXES"
]
restore var Nothing = unsetEnv var
restore var (Just val) = setEnv var val True
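-- (Illustration.) Any test that shells out to cabal can be wrapped in the same way,
-- e.g. withoutStackEnv (runWithExtraFiles "multi" session), as the two
-- multi-component tests below do.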
simpleMultiTest :: TestTree
simpleMultiTest = testCase "simple-multi-test" $ withoutStackEnv $ runWithExtraFiles "multi" $ \dir -> do
let aPath = dir </> "a/A.hs"
bPath = dir </> "b/B.hs"
aSource <- liftIO $ readFileUtf8 aPath
(TextDocumentIdentifier adoc) <- createDoc aPath "haskell" aSource
expectNoMoreDiagnostics 0.5
bSource <- liftIO $ readFileUtf8 bPath
bdoc <- createDoc bPath "haskell" bSource
expectNoMoreDiagnostics 0.5
locs <- getDefinitions bdoc (Position 2 7)
let fooL = mkL adoc 2 0 2 3
checkDefs locs (pure [fooL])
expectNoMoreDiagnostics 0.5
-- Like simpleMultiTest but open the files in the other order
simpleMultiTest2 :: TestTree
simpleMultiTest2 = testCase "simple-multi-test2" $ withoutStackEnv $ runWithExtraFiles "multi" $ \dir -> do
let aPath = dir </> "a/A.hs"
bPath = dir </> "b/B.hs"
bSource <- liftIO $ readFileUtf8 bPath
bdoc <- createDoc bPath "haskell" bSource
expectNoMoreDiagnostics 5
aSource <- liftIO $ readFileUtf8 aPath
(TextDocumentIdentifier adoc) <- createDoc aPath "haskell" aSource
-- Need to have some delay here or the test fails
expectNoMoreDiagnostics 5
locs <- getDefinitions bdoc (Position 2 7)
let fooL = mkL adoc 2 0 2 3
checkDefs locs (pure [fooL])
expectNoMoreDiagnostics 0.5
sessionDepsArePickedUp :: TestTree
sessionDepsArePickedUp = testSession'
"session-deps-are-picked-up"
@ -2138,6 +2197,9 @@ sessionDepsArePickedUp = testSession'
testSession :: String -> Session () -> TestTree
testSession name = testCase name . run
testSessionWithExtraFiles :: FilePath -> String -> (FilePath -> Session ()) -> TestTree
testSessionWithExtraFiles prefix name = testCase name . runWithExtraFiles prefix
testSession' :: String -> (FilePath -> Session ()) -> TestTree
testSession' name = testCase name . run'
@ -2172,6 +2234,19 @@ mkRange a b c d = Range (Position a b) (Position c d)
run :: Session a -> IO a
run s = withTempDir $ \dir -> runInDir dir s
runWithExtraFiles :: FilePath -> (FilePath -> Session a) -> IO a
runWithExtraFiles prefix s = withTempDir $ \dir -> do
copyTestDataFiles dir prefix
runInDir dir (s dir)
copyTestDataFiles :: FilePath -> FilePath -> IO ()
copyTestDataFiles dir prefix = do
-- Copy all the test data files to the temporary workspace
testDataFiles <- getDirectoryFilesIO ("test/data" </> prefix) ["//*"]
for_ testDataFiles $ \f -> do
createDirectoryIfMissing True $ dir </> takeDirectory f
copyFile ("test/data" </> prefix </> f) (dir </> f)
run' :: (FilePath -> Session a) -> IO a
run' s = withTempDir $ \dir -> runInDir dir (s dir)
@ -2183,11 +2258,6 @@ runInDir dir s = do
-- since the package import test creates "Data/List.hs", which otherwise has no physical home
createDirectoryIfMissing True $ dir ++ "/Data"
-- Copy all the test data files to the temporary workspace
testDataFiles <- getDirectoryFilesIO "test/data" ["//*"]
for_ testDataFiles $ \f -> do
createDirectoryIfMissing True $ dir </> takeDirectory f
copyFile ("test/data" </> f) (dir </> f)
let cmd = unwords [ghcideExe, "--lsp", "--test", "--cwd", dir]
-- HIE calls getXdgDirectory, which assumes that HOME is set.