From: Rainer S. <rai...@gm...> - 2012-11-20 20:59:30
|
While looking for a way to avoid a new checkout, I found that

(a) there is no way except a fresh checkout (svn switch won't work because
the new repository has a different UUID);

(b) you should NOT commit anything to the old repository. For some reason
the old svn repository is still there and seems to be active. (Found at the
very bottom of http://forum.freegamedev.net/viewtopic.php?f=48&t=3294)
So better lock the old repo.

BTW: the new repo URL goes via ssh; I hope that speeds things up.

Rainer

On Tue, 20 Nov 2012 at 20:27 -0000, Arthur Norman wrote:
> The Reduce stuff at sourceforge is being upgraded to the new arrangements
> that Sourceforge have. Among other things this is expected to bring a
> benefit in that it will remain possible to run a wiki there. But there is
> a cost in that the address for checking the code out changes. So those of
> you who have grabbed a copy using subversion are liable to need to look on
> the "code" tab of the sourceforge page as explained below and check out
> afresh with the new URL. As I send this the status is listed as
> "status: importing" so one can not fetch the new version just yet - but
> possibly by the time you read this message it will be there. I think that
> when Sourceforge changes we sort of have to track with it, so I hope this
> will not cause anybody pain!
> Arthur
>
> ---------- Forwarded message ----------
> Date: Tue, 20 Nov 2012 20:07:03 +0000
> From: SourceForge.net <nor...@in...>
> Reply-To: no...@in...
> To: no...@in...
> Subject: SourceForge Project Upgrade Notification
>
> Your project, reduce, has been upgraded.
>
> Your source code repositories are currently being migrated to the new
> setup. You will receive another email when that import is complete. That
> means that you and any other developers should do a fresh checkout using
> the new repository location when it is ready (see the "code" tab). Please
> be aware that large repositories may take a long time.
>
> Please report any issues to us at
> https://sourceforge.net/p/forge/site-support/new/ Thanks!

Rainer Schöpf |
|
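Rainer's points (a) and (b) amount to the following command sketch. The new repository URL is a placeholder here (the real one appears on the project's "code" tab), and this assumes Subversion's usual behaviour of refusing a relocate when repository UUIDs differ:

```sh
# Inspect the UUID of the existing working copy:
svn info | grep 'Repository UUID'

# 'svn switch --relocate OLD-URL NEW-URL' is refused when the UUIDs
# differ, so a fresh checkout against the new URL is the only option:
svn checkout NEW-URL-FROM-CODE-TAB reduce-new

# Per point (b): do not commit to the old URL, even though the old
# repository still appears to accept connections.
```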
From: Arthur N. <ac...@ca...> - 2012-11-20 20:27:59
|
The Reduce stuff at sourceforge is being upgraded to the new arrangements
that Sourceforge have. Among other things this is expected to bring a
benefit in that it will remain possible to run a wiki there. But there is
a cost in that the address for checking the code out changes. So those of
you who have grabbed a copy using subversion are liable to need to look on
the "code" tab of the sourceforge page as explained below and check out
afresh with the new URL. As I send this the status is listed as
"status: importing" so one can not fetch the new version just yet - but
possibly by the time you read this message it will be there. I think that
when Sourceforge changes we sort of have to track with it, so I hope this
will not cause anybody pain!
Arthur
---------- Forwarded message ----------
Date: Tue, 20 Nov 2012 20:07:03 +0000
From: SourceForge.net <nor...@in...>
Reply-To: no...@in...
To: no...@in...
Subject: SourceForge Project Upgrade Notification
Your project, reduce, has been upgraded.
Your source code repositories are currently being migrated to the new
setup. You will receive another email when that import is complete. That
means that you and any other developers should do a fresh checkout using
the new repository location when it is ready (see the "code" tab). Please
be aware that large repositories may take a long time.
Please report any issues to us at
https://sourceforge.net/p/forge/site-support/new/ Thanks!
|
|
From: Rainer S. <rai...@gm...> - 2012-11-20 17:34:39
|
Hello Raffaele,

On Tue, 20 Nov 2012 at 12:29 +0100, Raffaele Vitolo wrote:
> Dear All,
>
> I successfully compiled and ran Reduce on Scientific Linux version 5.4.
> This is the Linux flavour developed and used by CERN, Fermilab, etc.
> The packages that allowed me to compile successfully were, besides the
> compiler:
>
> pth
> pth-dev
> ncurses
> ncurses-devel
> libXft
> libXft-devel
> fontconfig
> fontconfig-devel
> xorg-x11-proto-devel

That's good to hear!

> Maybe it is worth adding it to the reduce-wiki; I can do it if I have
> permissions (actually it seems I don't).

Hmm, as far as I can see, you need to be a member of the "editors" user
group. I don't have the necessary rights to change this.

> By the way, it would be nice to have Linux deb and rpm plus a Windows
> installer; this would help increase the Reduce user base. Is there any
> work in progress on this topic?

Yes. There are directories debianbuild and winbuild directly below trunk,
each of them containing a README and a Makefile.

For .deb and .rpm, cd to debianbuild and run make (provided you have all
the necessary packages installed; see debianbuild/README and
debianbuild/reduce/debian/control). The .deb files are built and then
converted to .rpm using alien. I have installed these .rpm on OpenSuse
only, but I don't expect problems on Redhat or Scientific Linux.

The Windows installer is work in progress. The Makefile assembles all
necessary files and generates an input file for Inno Setup, with which
you can create the installable package. This works, but there are a
couple of open issues.

> I could help with deb, maybe also rpm.

It would help if you could try to build and install the .deb and .rpm
files and test the Reduce installation.

> A preliminary deb (not by me) can be found here:
> http://www.getdeb.net/updates/Ubuntu/12.04/?q=reduce
> I think that some work should be done on it in order to introduce it
> into Debian. After this step, the package would probably spread across
> all Debian-based distributions.

If you know someone who would be willing to take this up, we'd be
grateful. I'm afraid that nowadays it is rather difficult to become a
debian package maintainer.

Rainer |
|
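The .deb/.rpm build steps Rainer describes condense into a short command sketch. Paths and tool names are taken from his message; this assumes the prerequisites listed in debianbuild/README are already installed:

```sh
# From a checkout of trunk: build the Debian packages.
cd trunk/debianbuild
make
# Per the message, the resulting .deb files are then converted
# to .rpm using alien (see debianbuild/README for details).
```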
From: Raffaele V. <raf...@un...> - 2012-11-20 11:27:19
|
Dear All,

I successfully compiled and ran Reduce on Scientific Linux version 5.4.
This is the Linux flavour developed and used by CERN, Fermilab, etc.
The packages that allowed me to compile successfully were, besides the
compiler:

pth
pth-dev
ncurses
ncurses-devel
libXft
libXft-devel
fontconfig
fontconfig-devel
xorg-x11-proto-devel

Maybe it is worth adding this to the reduce-wiki; I can do it if I have
permissions (actually it seems I don't).

By the way, it would be nice to have Linux deb and rpm packages plus a
Windows installer; this would help increase the Reduce user base. Is there
any work in progress on this topic? I could help with deb, maybe also rpm.
A preliminary deb (not by me) can be found here:
http://www.getdeb.net/updates/Ubuntu/12.04/?q=reduce
I think that some work should be done on it in order to introduce it into
Debian. After this step, the package would probably spread across all
Debian-based distributions.

Best wishes,
raf.

--
Raffaele Vitolo, Dipartimento di Matematica e Fisica 'E. De Giorgi'
Universita' del Salento, via per Arnesano 73100 Lecce ITALY
tel.: +39 0832 297425 (office)
fax.: +39 0832 297594
home page: http://poincare.unisalento.it/vitolo |
|
From: Rainer S. <rai...@gm...> - 2012-11-17 13:59:06
|
This expression:
p2:=1/(sqrt(d-x)*sqrt(c-x)*sqrt(b-x)*sqrt(a-x)*(a*b-a*x-b*x+x**2));
takes forever to integrate.
With the new switch trintsubst (to trace only the substitutions) you see that
more and more complex substitutions are tried, without any reasonable result:
load_package int;
on trintsubst;
int(p2,x);
As you can see, at one point there appears the square root of polynomials of
hyperbolic functions, which isn't a sensible way to go, in my opinion.
I added another new switch nointsubst to switch off the substitution attempts
entirely. With "on nointsubst", the integral is returned unevaluated within a
very short time.
I think this needs some heuristics to stop these substitutions when they lead to
overly complicated integrands. Maybe stop when surds or exponentials with
transcendental functions appear.
Any ideas?
Rainer
|
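The two switches Rainer mentions can be exercised as follows. Note that `trintsubst` and `nointsubst` are his newly added switches, so this sketch assumes a build that already contains them:

```reduce
load_package int;

p2 := 1/(sqrt(d-x)*sqrt(c-x)*sqrt(b-x)*sqrt(a-x)*(a*b-a*x-b*x+x**2));

on trintsubst;   % trace only the substitutions the integrator tries
int(p2, x);      % increasingly complex substitutions appear, no result

off trintsubst;
on nointsubst;   % switch off the substitution attempts entirely
int(p2, x);      % returns the integral unevaluated in a very short time
```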
|
From: Arthur N. <ac...@ca...> - 2012-11-12 16:07:35
|
> Hi Arthur,
>
> One of the struggles I've faced since working on the open source version
> of AXIOM is getting free Lisp systems to work flawlessly everywhere a C++
> compiler is available. Working on AXIOM has only strengthened my views on
> Lisp (including Common Lisp), but in ways Lisp enthusiasts would not
> welcome...
>
> I would be interested in a portable, open source, efficient Lisp system
> that integrates pretty well with C++ -- my primary interest in this is
> OpenAxiom, which is already a mixture of Spad, C++, and Lisp.
>
> Best,
>
> -- Gaby
>
>
Thanks - and of course you may be aware that in the distant past my
Lisp was the one that NAG delivered Axiom on. For that work I put in "just
enough" Common Lispisms to allow Axiom to run, since I started off with a
less complicated Lisp dialect. But in what I am doing now I am trying to
roll at least a lot of those bits into the main trunk of all I do.
I will exchange separate emails with you away from the general lists to
discuss your views on Lisp and mine, and see if we have a practical way
ahead...
Arthur
|
|
From: Gabriel D. R. <gd...@in...> - 2012-11-12 12:41:57
|
On Sun, Nov 11, 2012 at 4:11 PM, Arthur Norman <ac...@ca...> wrote:
> The CSL Lisp system was developed in around 1991-2, and a report on it was
> published at DISCO '93 - then appearing in J Symb Comp in 1995. So it is
> around 20 years old. So I have been looking at a fairly major re-work of
> lots of it. This message is to check if anybody has time, energy and
> inclination to join in or at least advise. I am not putting my fragments
> on sourceforge at this stage because they are a bit fragmentary thus far!
>
> My road map so far has a whole bunch of changes as from CSL:
>
> (1) Sources in C++ not C. Well as many as possible are unchanged as much
> as possible, but all the files are *.cpp and are compiled using g++.
> (2) I have designed a conservative incremental garbage collector that will
> make putting in native compilation much easier than it was when I needed
> to live with a precise garbage collector and C-compatible code.
> (3) There is also the bulk of the design of how to make the Lisp support
> threads. The current question I need to resolve is what to do about
> property lists...
> (4) Despite reservations about it, the Common Lisp-like aspects of the new
> Lisp will be more to the fore than they were with CSL.
> (5) The GUI will use wxWidgets not FOX, which among other things will let
> it support the Macintosh directly rather than just via X11.
>
> The current state is that I have the first 60K lines of C from CSL sort of
> adapted, and I can sometimes do read-eval-print on simple things. But
> there is a lot to do and a bit of help or encouragement would be jolly
> welcome... Anybody keen to join in?
>
> Arthur

Hi Arthur,

One of the struggles I've faced since working on the open source version
of AXIOM is getting free Lisp systems to work flawlessly everywhere a C++
compiler is available. Working on AXIOM has only strengthened my views on
Lisp (including Common Lisp), but in ways Lisp enthusiasts would not
welcome...

I would be interested in a portable, open source, efficient Lisp system
that integrates pretty well with C++ -- my primary interest in this is
OpenAxiom, which is already a mixture of Spad, C++, and Lisp.

Best,

-- Gaby |
|
From: Arthur N. <ac...@ca...> - 2012-11-11 22:12:28
|
The CSL Lisp system was developed in around 1991-2, and a report on it was
published at DISCO '93 - then appearing in J Symb Comp in 1995. So it is
around 20 years old. So I have been looking at a fairly major re-work of
lots of it. This message is to check if anybody has time, energy and
inclination to join in or at least advise. I am not putting my fragments
on sourceforge at this stage because they are a bit fragmentary thus far!
My road map so far has a whole bunch of changes as from CSL:
(1) Sources in C++ not C. Well as many as possible are unchanged as much
as possible, but all the files are *.cpp and are compiled using g++.
(2) I have designed a conservative incremental garbage collector that will
make putting in native compilation much easier than it was when I needed
to live with a precise garbage collector and C-compatible code.
(3) There is also the bulk of the design of how to make the Lisp support
threads. The current question I need to resolve is what to do about
property lists...
(4) Despite reservations about it, the Common Lisp-like aspects of the new
Lisp will be more to the fore than they were with CSL.
(5) The GUI will use wxWidgets not FOX, which among other things will let
it support the Macintosh directly rather than just via X11.
The current state is that I have the first 60K lines of C from CSL sort of
adapted, and I can sometimes do read-eval-print on simple things. But
there is a lot to do and a bit of help or encouragement would be jolly
welcome... Anybody keen to join in?
Arthur
|
|
From: Arthur N. <ac...@ca...> - 2012-11-10 19:52:37
|
>
> What happens here is that the sfgamma package is autoloaded while the above rule
> is being processed. The toplevel code of the sfgamma package does algebraic
> mode assignments, which do not work while a rule is being defined.
>
>
> Possible solutions:
>
> 1. Rewrite the sfgamma package to eliminate dangerous assignments at load time.
>
> 2. Define the gamma function and its friends as operators in the Reduce core,
> load sfgamma only when computations with gamma, etc. are done.
>
> I think it would be better to define all the special operators from the specfn
> and sfgamma packages in the core. Another case are si and ci, which are defined
> when either specfn or int is loaded.
>
> Rainer
>
I would (I think) like it if loading packages always happened
automatically when code from them was called, but merely whether a
package was loaded or not never changed any global variables or behaviour.
Any changes of behaviour ought to be controlled by "on someflag;", and for
every flag "off someflag;" should restore the previous behaviour. The idea
of modules being loadable was important when computers did not have enough
memory to make it feasible to have all of Reduce resident at once. These
days it would seem plausible to think of having Reduce have everything
loaded from the start...
However, the incremental development of Reduce over time means that where
we are now, some modules redefine basic functions and rather more set
flags that alter behaviour globally, so my ideal is some way off.
But migrating operator declarations and the like to the core seems to me
a step in the right direction.
Arthur
|
|
From: Rainer S. <rai...@gm...> - 2012-11-10 17:29:13
|
In a freshly started Reduce:

>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
let int(gamma(~~a+~~b*~x)^~n*polyGamma(0,~~a+~~b*~x),x)
      => gamma(a+b*x)^n/(b*n);

***** positive numeric value or `RESET' required

***** module sfpsi of package sfgamma cannot be loaded
<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<

What happens here is that the sfgamma package is autoloaded while the
above rule is being processed. The toplevel code of the sfgamma package
does algebraic mode assignments, which do not work while a rule is being
defined.

Possible solutions:

1. Rewrite the sfgamma package to eliminate dangerous assignments at load
time.

2. Define the gamma function and its friends as operators in the Reduce
core, and load sfgamma only when computations with gamma etc. are done.

I think it would be better to define all the special operators from the
specfn and sfgamma packages in the core. Another case are si and ci, which
are defined when either specfn or int is loaded.

Rainer |
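Since the failure happens only when the autoload fires mid-rule, one would expect loading the package explicitly beforehand to sidestep it. The thread does not confirm this workaround, so treat it purely as a hypothesis to try:

```reduce
load_package sfgamma;   % preload, so defining the rule triggers no autoload

let int(gamma(~~a+~~b*~x)^~n*polyGamma(0,~~a+~~b*~x),x)
      => gamma(a+b*x)^n/(b*n);
```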