This is a workaround to avoid the error: "java.io.IOException: RSA premaster
secret error".
In Java Web Start and the Java web plugin, there seems to be a Java policy
that prevents untrusted code from being loaded, and (probably for security
reasons) it doesn't like the files in the JDK's lib/icedtea/jre/lib/ext
directory to be symlinks.
Worked around it by copying those files instead of symlinking them.
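A minimal sketch of that fix as a fixup hook (the hook choice and glob
are illustrative, not the actual expression):

  postFixup = ''
    # The plugin rejects symlinked files in the ext directory, so
    # turn each symlink into a real copy of its target.
    for f in $out/lib/icedtea/jre/lib/ext/*; do
      if [ -L "$f" ]; then
        target="$(readlink -f "$f")"
        rm "$f"
        cp "$target" "$f"
      fi
    done
  '';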
When using -fsanitize=... options, clang implicitly links the binary
against static runtime libraries that are part of llvm, but it expects
to find them under clang's own prefix.
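A hedged sketch of the workaround this suggests (the version number and
resource-directory layout are assumptions, not the actual derivation):

  # In the clang expression: expose llvm's static runtime archives
  # (libclang_rt.*) under clang's own resource directory, where the
  # -fsanitize driver looks for them.
  postInstall = ''
    ln -sv ${llvm}/lib/clang/3.4/lib $out/lib/clang/3.4/
  '';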
fsharp: Upgrade to fsharp-3.0
[ Shea: I'm merging thoughtpolice's commits without further testing
beyond code inspection, as I believe their work has proven itself ]
upgrade Mono -> 3.2.8 + LLVM support
[ Shea: I'm merging thoughtpolice's commits without further testing
beyond code inspection, as I believe their work has proven itself ]
On MinGW, we're passing these programs to the configure script, but this
obviously won't work for non-autoconf-based projects.
Signed-off-by: aszlig <aszlig@redmoonstudios.org>
Both branches have quite a lot in common, so it's time for a merge, to
clean up both implementations and to generalize them as much as
possible.
This also closes #1876.
Conflicts:
pkgs/development/interpreters/lua-5/5.2.nix
pkgs/development/libraries/SDL/default.nix
pkgs/development/libraries/glew/default.nix
pkgs/top-level/all-packages.nix
Cross-compiling stuff against Mac OS X's CoreFoundation won't work
without ObjC support, and we don't want to compile command-line utilities
only, right?
Signed-off-by: aszlig <aszlig@redmoonstudios.org>
Let's finally hook everything into the existing cross-building
infrastructure. We're using --with-sysroot instead of --with-headers
here, because the Xcode SDK contains references to /usr/lib.
I've tried to patch those references, but unfortunately (at least with
install_name_tool) it isn't possible to change those references in stub
dylibs.
So after bugging @tpoechtrager with annoying questions (thanks again), I
think my initial approach (patching the SDK itself and/or regenerating
the dylib stubs) was way too complicated, so I ended up with this
implementation.
Also, I've added a condition to binutilsCross to use cctools if the libc
is set to libSystem. This might need some cleanups someday, mainly to
figure out how to properly bridge cctools and binutils.
So, as an example of how to cross-compile GNU Hello to Darwin, you can
use something like this:

  (import <nixpkgs> {
    crossSystem = {
      config = "x86_64-apple-darwin13";
      arch = "x86_64";
      libc = "libSystem";
      platform = {};
    };
  }).hello.crossDrv
Signed-off-by: aszlig <aszlig@redmoonstudios.org>
This adds LLVM support to Mono using their custom fork, which is based
on upstream LLVM 3.4svn. Somehow the fork has gone for a while without a
patch to fix its CMakeLists.txt...
The upstream commit is based on the mono3 branch.
Signed-off-by: Austin Seipp <aseipp@pobox.com>
Mingw(32) is rather poorly maintained and has quite a lot of bugs. And
because our Windows cross builds were also poorly maintained and most of
the cross-tests were broken as well, I'm just taking this step and trying
to switch to mingw-w64 for everything "cross Windows".
Signed-off-by: aszlig <aszlig@redmoonstudios.org>
This removes the need for hacks like stdenv.regenerate. It also
ensures that overrideGCC is now stackable (so ‘stdenv = useGoldLinker
clangStdenv’ works).
HotSpot uses the absolute path of libjvm.so to determine the java.home
property (ignoring $JAVA_HOME), which is in turn used by
ToolProvider.getSystemJavaCompiler() to load tools.jar. So we need to
do some trickery to ensure that if java gets invoked from the jdk
output (rather than the jre output), it finds libjvm.so in the jdk output.
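Purely to illustrate the constraint (this is not the actual fixup), the
jdk output needs a real libjvm.so of its own instead of a symlink into
the jre output, e.g.:

  # Hypothetical fragment; the openjdk paths are illustrative.
  postFixup = ''
    # HotSpot derives java.home from libjvm.so's resolved path, so a
    # jdk-local copy makes java.home point into the jdk tree, where
    # tools.jar can then be found.
    jvm=lib/openjdk/jre/lib/amd64/server/libjvm.so
    rm "$out/$jvm"
    cp "$jre/$jvm" "$out/$jvm"
  '';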
It sucks, I know, but GHC just doesn't compile reliably when built with
some -j<n> option. :-( We have build errors because of apparent race
conditions all over the place on Hydra. This causes so much trouble for
users that it's not worth keeping this option enabled, IMHO.
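In nixpkgs terms the fix amounts to a one-liner in the GHC expression,
roughly:

  # GHC's build races under make -j<n>, so force a serial build.
  enableParallelBuilding = false;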
Now most packages in the llvm suite are built as separate derivations.
The exceptions are:
* compiler-rt must currently be built with llvm. This increases llvm's
size by 6 MB.
* clang-tools-extra must be built with clang
In addition, the top-level llvm attribute now defaults to llvm 3.4, and
llvm 3.3 must be accessed via the llvm_33 attribute. This is to make the
out-of-date packages obvious in the hope that eventually all will be
updated to work with 3.4 and 3.3 can be removed. I think we should keep
this policy in the future (latest llvm gets top-level name, the rest are
versioned until they can be removed).
The llvm packages (except libc++, an exception I will try to remove
on the next update) can all be accessed via the llvmPackages attribute,
and there are also aliases for the packages that already existed (llvm,
clang, and dragonegg).
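For illustration, the resulting layout looks roughly like this from a
user's perspective (the list itself is just an example; only the
attribute names mentioned above are real):

  with import <nixpkgs> {}; [
    llvm                    # now defaults to llvm 3.4
    llvm_33                 # versioned attribute for the older release
    llvmPackages.clang      # same package as the top-level clang alias
    llvmPackages.dragonegg
  ]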
Signed-off-by: Shea Levy <shea@shealevy.com>
Some packages in the llvm suite (e.g. compiler-rt) cannot be built
separately from llvm itself, and while some others (e.g. clang) can be,
the combined build is much better tested (we've had to work around
annoying issues before). So this puts llvm, clang, clang-tools-extra,
compiler-rt, lld, lldb, and polly all into one big build (llvmFull).
This build includes a static llvm, as dynamic llvm is similarly less
tested and has known failures.
This also updates libc++ and dragonegg. libc++ now builds against
libc++abi as a separate package rather than building it during the
libc++ build.
The clang purity patch is gone. Instead, we simply set --sysroot to
/var/empty for pure builds, as all impure paths are either looked up in
the gcc prefix (which we hard-code at compile time) or in the sysroot.
This also means that if NIX_ENFORCE_PURITY is 0 then clang will look in
the normal Linux paths by default, which is the proper behavior IMO.
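A rough sketch of that logic as it might appear in the compiler wrapper
script (variable names are illustrative, not the real wrapper):

  # Only enforce the empty sysroot for pure builds; otherwise clang
  # falls back to its normal search paths.
  if [ "$NIX_ENFORCE_PURITY" = 1 ]; then
    extraFlagsBefore="--sysroot=/var/empty $extraFlagsBefore"
  fi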
polly required an updated isl. When stdenv-updates is merged, perhaps we
can update the isl used by gcc and avoid having two versions.
Since llvm on its own is now separate from the llvm used by clang, I've
removed myself as maintainer from llvm and will leave maintenance of
that to those who are interested in llvm separate from clang.
Signed-off-by: Shea Levy <shea@shealevy.com>
Specifically, we are trying to fix the following error seen on Hydra:
../../gcc-4.7.3/gcc/gengtype-lex.c:1:21: fatal error: bconfig.h: No such file or directory
The patch is taken from gcc's SVN revision 193691.
The 3.4 code received preliminary testing in x-updates,
described by 2e4eab1228.
Updates to llvm break builds of dependent packages (in all cases I've
seen), and upstream is often slow to port to the newest version.
Consequently, it seems better to keep more versions (two ATM), both in
one file so they can share any common changes.
This is also why versioned llvm_* attributes are proposed.
This unifies the "openjdk" and "openjre" packages. The JDK is placed
in the "out" output, the JRE in "jre".
Also, everything is now stored in $prefix/lib/openjdk, so the JDK/JRE
no longer pollute user environments with files like
"ASSEMBLY_EXCEPTION" at top-level.
This is to keep it separate from other changes coming from master.
Conflicts:
pkgs/development/libraries/glibc/2.18/common.nix (taking both changes)
pkgs/development/libraries/ncurses/5_4.nix (deleted)
Sometimes the build fails with:
In file included from ../../gcc-4.4.6/gcc/ada/seh_init.c:44:
../../gcc-4.4.6/gcc/system.h:418: error: conflicting types for 'strsignal'
/nix/store/6h129q168ahnl2nzw6azr239cba884ng-glibc-2.18/include/string.h:560: note: previous declaration of 'strsignal' was here
and sometimes it doesn't. Hopefully disabling parallel builds fixes
this.
http://hydra.nixos.org/build/7179481
SDCC 3.3.0 Feature List:
* Many small improvements in code generation for the z80-related ports - merged smallopts branch
* lospre (currently enabled for z80-related and hc08-related ports only) - merged lospre branch
* More efficient initialization of globals in z80, z180, r2k and r3ka ports.
* Inclusion of tests from the gcc test suite into the sdcc regression test suite led to many bugs being found and fixed.
* Split sdas390 from sdas8051
* Merged big parts of ASxxxx v5 into sdas
* New PIC devices (synchronization with MPLABX 1.60). (Except for very old MCUs.)
* New script (mcs51-disasm.pl) that disassembles hex files containing MCS51 code.
* Added the PIC16F1788 and PIC16F1789 devices.
* C11 _Alignof operator.
* C11 _Alignas alignment specifier.
* C11 _Static_Assert static assertion.
Numerous feature requests and bug fixes are included as well.
Apparently Apple thinks that faking gcc with a clang stub but not providing a
compatible libgcc.a is not going to cause any issues. They really do hate the GPL.
1) The wrapper erroneously used the ghc-pkg flag "--package-db" instead of
"--global-package-db". The result was that packages installed locally in
~/.ghc and ~/.cabal were invisible to GHC. This has been fixed.
2) The wrapper now deals gracefully with an empty package set: if no package
is requested to be included in the wrapped environment, the wrapper just
installs a pristine GHC.
3) Correctly configure the "docdir" path returned by ghc-paths.
4) Added some comments that describe the idea behind our ghc-paths patches and
give users some sample shell code that can be used to import our special
environment variables into the currently running shell, so that programs
outside of the wrapped environment can use them, too.
The ghcWithPackage expression now has an argument 'ignoreCollisions' that
allows users to disable the path collision check like so:
(pkgs.haskellPackages.ghcWithPackages (pkgs: with pkgs; [ haskellPlatform ])).override { ignoreCollisions = true; };
See d64917ad17
for a long and detailed discussion of why these path collisions may occur.
Haskell packages -- i.e. packages built by our Cabal builder -- invariably have
the attributes 'pname' and 'version'. We use the absence of these attributes to
recognize non-Haskell packages and filter them from the closed package set
generated by closePropagation. We do this so that the generated Haskell
environment won't contain paths like "/lib/libz.a", which are part of the
closure but have nothing to do with Haskell.
The previous scheme used the attribute 'ghc' to accomplish the same thing, but
unfortunately other packages happen to contain a 'ghc' attribute, too, like the
old-style ghc-wrapper. Including the ghc-wrapper in this environment is
pointless, obviously. The new approach filters the ghc-wrapper successfully.
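A minimal sketch of the new filter (the helper name and the 'deps'
argument are made up; closePropagation is the function named above):

  let
    isHaskellPkg = drv: (drv ? pname) && (drv ? version);
  in
    lib.filter isHaskellPkg (lib.closePropagation deps)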
* There now is full support for building Haskell packages as shared libraries
for GHC versions 7.4.2 or later. The Cabal builder recognizes the following
attributes:
- enableSharedLibraries configures Cabal to build shared libraries in
addition to static ones. This option requires that all dependencies of
the package have been compiled for use in shared libraries, too.
- enableSharedExecutables configures Cabal to prefer shared libraries when
linking executables.
The default values for these attributes are arguments to the haskellPackages
expression. (A short usage sketch follows after this list.)
* Haskell builds now run in a LANG="en_US.UTF-8" environment to avoid plenty
of build and test suite errors. Without this setting, GHC seems unable to
deal with the UTF-8 character encoding that's generally considered standard
in the Haskell world.
* The Cabal builder supports a new attribute 'testTarget' to specify the exact
set of tests to be run during the check phase.
* The ghc-wrapper attribute ghcVersion has been removed. Instead, we use the
ghc.version attribute, which exists in unwrapped GHC derivations, too.
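Putting the new attributes together, a hypothetical package built with
the Cabal builder might look like this (the package name, hash, and test
suite name are made up):

  cabal.mkDerivation (self: {
    pname = "frobnicator";
    version = "1.0";
    sha256 = "...";
    enableSharedLibraries = true;    # also build shared libraries
    enableSharedExecutables = true;  # link executables against them
    testTarget = "unit-tests";       # run only this suite in the check phase
  })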