Opened 4 years ago
Closed 4 years ago
#19777 closed enhancement (fixed)
Three new SRGs and db update
Reported by:  ncohen  Owned by:  

Priority:  major  Milestone:  sage-7.0 
Component:  graph theory  Keywords:  
Cc:  dimpase  Merged in:  
Authors:  Nathann Cohen, Dima Pasechnik  Reviewers:  Nathann Cohen, Dima Pasechnik 
Report Upstream:  N/A  Work issues:  
Branch:  29f123c (Commits)  Commit:  29f123c10658cce8b05306738fcf720d5d869828 
Dependencies:  Stopgaps: 
Description
Andries Brouwer's database has been updated, and this ticket forwards the changes to Sage's copy of it.
4 graphs have been added to it, 3 of which are built from 2-intersection codes. The last one comes from a reference I haven't been able to find yet.
Nathann
http://www.steinertriples.fr/ncohen/tmp/graphs20151224.tar.bz2
Change History (57)
comment:1 Changed 4 years ago by
 Branch set to public/19777
 Status changed from new to needs_review
comment:2 Changed 4 years ago by
 Commit set to bb930ca4c4e9314670f4b64aab956c6e4eeb6e05
comment:3 Changed 4 years ago by
I think I already mentioned on another ticket that 2-intersection codes and 2-distance codes are (almost) the same thing. Why don't you add these 2-intersection codes to the corresponding database, instead of hardcoding them into strongly_regular_graphs_db?
comment:4 Changed 4 years ago by
a 2-intersection code is a projective 2-weight code:
e.g. take L from SRG_1024_363_122_132()
and do
sage: K = GF(2**2,'x')
sage: x = K.gens()[0]
sage: L = [map({'0':0,'1':1,'a':x,'b':x**2}.get,xx) for xx in L]
sage: L = Matrix(K,map(list,L)).transpose()
sage: c = LinearCode(L); c
Linear code of length 121, dimension 5 over Finite Field in x of size 2^2
sage: [w for w,f in enumerate(c.weight_distribution()) if w and f]
[88, 96]
sage: c.weight_enumerator()
x^121 + 660*x^33*y^88 + 363*x^25*y^96
so you have the zero word, 660 words of weight 88, and 363 of weight 96.
comment:5 followup: ↓ 6 Changed 4 years ago by
Hmmmm.. I used your code and generated the graph from the two-weight code. For some reason it takes far more time.
sage: %time strongly_regular_from_two_weight_code(c)
CPU times: user 19 s, sys: 104 ms, total: 19.1 s
Wall time: 18.5 s
Graph on 1024 vertices
comment:6 in reply to: ↑ 5 ; followup: ↓ 7 Changed 4 years ago by
Replying to ncohen:
Hmmmm.. I used your code and generated the graph from the two-weight code. For some reason it takes far more time.
well, all I am saying is that these matrices etc should go into the database; it's another story how to use them -- nothing prevents you from calling the faster function on them...
comment:7 in reply to: ↑ 6 ; followup: ↓ 8 Changed 4 years ago by
well, all I am saying is that these matrices etc should go into the database; it's another story how to use them -- nothing prevents you from calling the faster function on them...
I understand, yet what you ask represents a large amount of work if you insist on making me do all of it.
comment:8 in reply to: ↑ 7 ; followup: ↓ 9 Changed 4 years ago by
Replying to ncohen:
well, all I am saying is that these matrices etc should go into the database; it's another story how to use them -- nothing prevents you from calling the faster function on them...
I understand, yet what you ask represents a large amount of work if you insist on making me do all of it.
I can do the work, it's not a problem (although I currently have keyboard/mouse elbow, and should not do much typing for a while), but do you agree that by right this data must be in coding/?
comment:9 in reply to: ↑ 8 Changed 4 years ago by
I can do the work, it's not a problem (although I currently have keyboard/mouse elbow, and should not do much typing for a while), but do you agree that by right this data must be in coding/?
Yeah yeah, that's where it should be. At the moment I am trying to figure out what exactly needs to be done, and how. Turns out that the difference between a two-weight code and a 2-intersection set is merely a matrix transposition, isn't it?
Nathann
comment:10 Changed 4 years ago by
sage: from sage.coding.two_weight_db import data
sage: %time strongly_regular_from_two_intersection_set(data[1]['M'].transpose())
CPU times: user 452 ms, sys: 24 ms, total: 476 ms
Wall time: 436 ms
Graph on 625 vertices
sage: %time strongly_regular_from_two_weight_code(data[1]['M'])
CPU times: user 1.68 s, sys: 16 ms, total: 1.7 s
Wall time: 1.67 s
Graph on 625 vertices
comment:11 followup: ↓ 12 Changed 4 years ago by
Okay, everything looks fairly straightforward. The only problem is with the SRG parameter 'guessing' from the two-weight codes. If we do the operation through the matrix's transpose, I have no idea what parameters we end up with.
Nathann
comment:12 in reply to: ↑ 11 Changed 4 years ago by
Replying to ncohen:
Okay, everything looks fairly straightforward. The only problem is with the SRG parameter 'guessing' from the two-weight codes. If we do the operation through the matrix's transpose, I have no idea what parameters we end up with.
It seems that k will be one of the eigenvalue multiplicities of the "usual" graph. I.e., often the same -- e.g. for the data on this ticket they are the same -- but not always; that's where the formulae on Chen's page go wrong. One of these f, g:
# Multiplicity of eigenvalues 'r,s' (f=lambda_r, g=lambda_s)
#
# They are integers (checked by the 'integrality condition').
f = -k*(s+1)*(k-s)/((k+r*s)*(r-s))
g =  k*(r+1)*(k-r)/((k+r*s)*(r-s))
Details are in http://pages.uoregon.edu/kantor/PAPERS/2WeightCodes.pdf.
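The two multiplicity formulas above can be checked numerically in plain Python (a sketch, not Sage code; the function name `srg_eigen_data` is made up for illustration):

```python
from math import isqrt

def srg_eigen_data(v, k, l, mu):
    """Eigenvalues r > s and multiplicities f, g of a (v, k, l, mu)-SRG."""
    # r and s are the roots of x^2 - (l - mu)*x - (k - mu) = 0
    disc = (l - mu) ** 2 + 4 * (k - mu)
    sq = isqrt(disc)
    assert sq * sq == disc, "conference-graph case (irrational r, s) not handled"
    r = (l - mu + sq) // 2
    s = (l - mu - sq) // 2
    # the multiplicity formulas quoted above (note that k + r*s == mu)
    f = -k * (s + 1) * (k - s) // ((k + r * s) * (r - s))
    g = k * (r + 1) * (k - r) // ((k + r * s) * (r - s))
    assert 1 + f + g == v  # the remaining eigenvalue, k, has multiplicity 1
    return r, s, f, g

# Petersen graph (10, 3, 0, 1): r = 1, s = -2, f = 5, g = 4
print(srg_eigen_data(10, 3, 0, 1))
# The (1024, 363, 122, 132) graph of this ticket gives f = 660, g = 363,
# matching the 660 + 363 nonzero codewords counted in comment 4.
print(srg_eigen_data(1024, 363, 122, 132))
```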
comment:13 followup: ↓ 14 Changed 4 years ago by
I still don't see a way to do the job cleanly.
comment:14 in reply to: ↑ 13 ; followup: ↓ 15 Changed 4 years ago by
Replying to ncohen:
I still don't see a way to do the job cleanly.
Do you mean a procedure to give possible parameters of a projective 2-weight code from (v,k,l,mu)? Anyway, a part of the job is to move these L's and the related info into the two-weight code database.
comment:15 in reply to: ↑ 14 ; followup: ↓ 16 Changed 4 years ago by
Do you mean a procedure to give possible parameters of a projective 2-weight code from (v,k,l,mu)?
I mean a way to use the "SRG from 2-intersection set" routine on the two-weight code instead of the current function we use. It's much faster, but if we do that we don't know how to predict which parameters we will end up with.
Anyway, a part of the job is to move these L's and the related info into the two-weight code database.
True but that leaves an unpleasant feeling of just not doing what needs to be done.
Nathann
comment:16 in reply to: ↑ 15 ; followup: ↓ 18 Changed 4 years ago by
Replying to ncohen:
Do you mean a procedure to give possible parameters of a projective 2-weight code from (v,k,l,mu)?
I mean a way to use the "SRG from 2-intersection set" routine on the two-weight code instead of the current function we use. It's much faster, but if we do that we don't know how to predict which parameters we will end up with.
I thought we don't have this implemented for the two-weight code graph on its words, do we?
Here we are in better shape: there are formulas in Cor. 3.7 of Calderbank-Kantor, written also on Eric Chen's pages. My understanding is that they give the parameters of the "SRG from 2-intersection set" (or perhaps its complement -- this is easy to figure out). This will answer the question
"is there a 2intersection proj. set giving (v,k,l,mu)SRG". (A)
To get the parameters of the graph on the codewords of the corresponding 2-weight code, one needs the duality of association schemes (Thm 5.7 and the discussion after it in Calderbank-Kantor). It is easy to compute them directly from the graph parameters (via eigenvalues etc). I'd be happy to provide an implementation of it. Thus, to answer the question
"is there a 2weight code giving (v,k,l,mu)SRG" (B),
one will compute the parameters (v,k',l',mu') of the dual (note that A==dual(dual(A))), and answer (A) for (v,k',l',mu'). Then use Thm 3.1 (p.100, same text), which tells one how to switch between (n,k,h_1,h_2) and the parameters of the corresponding 2-weight code.
Anyway, a part of the job is to move these L's and the related info into the twoweight code database.
True but that leaves an unpleasant feeling of just not doing what needs to be done.
above is the plan, what do you think about it?
Dima
comment:17 Changed 4 years ago by
 Commit changed from bb930ca4c4e9314670f4b64aab956c6e4eeb6e05 to 4d2c2f4496c95fa6b6cf2187747d62e3609677d3
Branch pushed to git repo; I updated commit sha1. This was a forced push. New commits:
4d2c2f4  trac #19777: Three new SRGs and db update

comment:18 in reply to: ↑ 16 Changed 4 years ago by
above is the plan, what do you think about it?
Here is my answer, as an admittedly cleaner commit.
If you know how to find your way in order to make strongly_regular_from_two_weight_code(L) as fast as strongly_regular_from_two_intersection_set(L.transpose()), we will definitely have gained something O_o
Nathann
comment:19 Changed 4 years ago by
 Commit changed from 4d2c2f4496c95fa6b6cf2187747d62e3609677d3 to 93bd35710e758924c7015a1b417eba94eb2d8bd5
Branch pushed to git repo; I updated commit sha1. New commits:
93bd357  trac #19777: Convert the other 2intersection set

comment:20 followup: ↓ 21 Changed 4 years ago by
I converted the old SRG from Disset into a code too. Cleaner. Slower, but cleaner.
comment:21 in reply to: ↑ 20 ; followup: ↓ 22 Changed 4 years ago by
Replying to ncohen:
I converted the old SRG from Disset into a code too. Cleaner. Slower, but cleaner.
Why has it got slower? I don't follow you here, sorry.
comment:22 in reply to: ↑ 21 ; followup: ↓ 24 Changed 4 years ago by
Why has it got slower? I don't follow you here, sorry.
Because building a graph using the matrix takes a different time depending on whether you go through a two-weight code or through the equivalent 2-intersection set. That's what I was saying in comments 18 and 10.
That's the thing that disturbs me. If we can make one function as fast as the other, then everything is fine.
Nathann
comment:23 Changed 4 years ago by
(and the time difference increases quickly with the size of the graph you build)
comment:24 in reply to: ↑ 22 Changed 4 years ago by
Replying to ncohen:
Why has it got slower? I don't follow you here, sorry.
Because building a graph using the matrix takes a different time depending on whether you go through a two-weight code or through the equivalent 2-intersection set. That's what I was saying in comments 18 and 10.
That's the thing that disturbs me. If we can make one function as fast as the other, then everything is fine.
but why do you build it via a slower function, despite a faster function being available?
comment:25 Changed 4 years ago by
............. because you made me do it ?..
comment:26 followup: ↓ 27 Changed 4 years ago by
What the hell man, I had those four functions in the strongly_regular_db file and you asked me to add them inside of the two-weight database. Now they are in the two-weight database, and the corresponding SRGs are built as they always were, through the "SRG from 2-weight code" function.
I thought that you said you knew how to guess the parameters of the SRG that would be obtained when using the "SRG from 2-intersection set" function. Because that's the problem, in case it needs to be said again: I cannot replace the first function by the other, for I don't know what exactly the parameters of the SRG will be. Will I get what I expect or its complement?
If you can guess that then you can use the faster of the two functions.
Nathann
comment:27 in reply to: ↑ 26 ; followup: ↓ 28 Changed 4 years ago by
Replying to ncohen:
What the hell man, I had those four functions in the strongly_regular_db file and you asked me to add them inside of the two-weight database. Now they are in the two-weight database, and the corresponding SRGs are built as they always were, through the "SRG from 2-weight code" function.
Sorry, I forgot that that DB gets processed by _build_small_srg_database() in its entirety. Now I have looked at the code and see that this is the case.
OK, I will speed it up. Where is the main slowness -- is it in strongly_regular_from_two_weight_code, or in the main body of _build_small_srg_database()?
The computations in the latter can be replaced by what I explained in comment 16.
I don't know how to speed up strongly_regular_from_two_weight_code, but in most cases (in particular for v=1024) it can be replaced by strongly_regular_from_two_intersection_set.
Dima
comment:28 in reply to: ↑ 27 Changed 4 years ago by
OK, I will speed it up. Where is the main slowness -- is it in strongly_regular_from_two_weight_code, or in the main body of _build_small_srg_database()?
In strongly_regular_from_two_weight_code, for it performs a computation on every pair of points in the code. It is not the case for strongly_regular_from_two_intersection_set, as you can see from its code.
The computations in the latter can be replaced by what I explained in comment 16. I don't know how to speed up strongly_regular_from_two_weight_code, but in most cases (in particular for v=1024) it can be replaced by strongly_regular_from_two_intersection_set.
I do not mind if all that is being done is replacing the first function by the other; the only problem is with the graph's parameters.
Or perhaps we can build one, and if it does not match, return its complement. Dirty, but operational.
Nathann
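Parameter-wise, that complement fallback is cheap: the complement's parameters follow directly from (v, k, l, mu). A minimal plain-Python sketch (the helper names are made up for illustration, not Sage API):

```python
def complement_srg_params(v, k, l, mu):
    """Parameters of the complement of a (v, k, l, mu) strongly regular graph."""
    return (v, v - k - 1, v - 2 - 2 * k + mu, v - 2 * k + l)

def match_or_complement(built, wanted):
    """Decide whether a freshly built graph has the wanted parameters
    as-is, after complementation, or not at all (hypothetical dispatcher)."""
    if built == wanted:
        return "as-is"
    if complement_srg_params(*built) == wanted:
        return "complement"
    return None

# Petersen (10, 3, 0, 1) and its complement, the triangular graph T(5):
print(complement_srg_params(10, 3, 0, 1))                  # (10, 6, 3, 4)
print(match_or_complement((10, 6, 3, 4), (10, 3, 0, 1)))   # complement
```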
comment:29 Changed 4 years ago by
I did checks, and out of 22 entries in the 2-weight codes db, 18 are self-dual, that is, one can use strongly_regular_from_two_intersection_set on the transpose of their matrices (more precisely, strongly_regular_from_two_intersection_set builds the complement of the graph, but this is fine -- we'd adjust doctests).
For the remaining 4 cases, I checked what one gets by using strongly_regular_from_two_intersection_set on the transpose of their matrices.
The 2-wt code of (512, 315, 202, 180) gives the graph of GQ(7,9).
The 2-wt code of (512, 219, 106, 84) gives (512, 73, 10, 12), and correspondingly the 2-wt code of (512, 73, 12, 10) gives (512, 219, 106, 84) (that is, it suffices to use just one of these 2-wt codes to get both graphs).
The 2-wt code of (243, 220, 199, 200) gives (243, 110, 37, 60) (that is, SRG_243_110_37_60() can be removed, or just not used...).
comment:30 Changed 4 years ago by
 Commit changed from 93bd35710e758924c7015a1b417eba94eb2d8bd5 to 1f08db56c47170d996c21e9ae720a717160b2093
comment:31 Changed 4 years ago by
this will need a bit more work -- not 100% ready for review yet. One part of it works more by accident than due to theory.
comment:32 followup: ↓ 33 Changed 4 years ago by
Yooooooo !
Do you mind if we move your last commit -- and the optimization issue -- to another ticket? It would be cool to have those new SRGs, and the update of the db as well, in Sage asap.
I can split it myself if you prefer.
Thanks,
Nathann
comment:33 in reply to: ↑ 32 Changed 4 years ago by
Replying to ncohen:
Do you mind if we move your last commit -- and the optimization issue -- to another ticket? It would be cool to have those new SRGs, and the update of the db as well, in Sage asap.
we are travelling today back to Oxford, and I know what I want to change. I think it should be ready tomorrow morning or earlier.
comment:34 Changed 4 years ago by
Okay then,
Nathann
comment:35 Changed 4 years ago by
 Commit changed from 1f08db56c47170d996c21e9ae720a717160b2093 to 205bdd5b2639d6e0f6d55d2a85c8ee23087ab718
Branch pushed to git repo; I updated commit sha1. New commits:
205bdd5  guard against runtime errors, remove useless check

comment:36 followup: ↓ 38 Changed 4 years ago by
OK, ready now :)
comment:37 Changed 4 years ago by
PS. (for a follow-up ticket?) I don't see why the 2-wt codes database does not store the complete weight enumerator, but recomputes it from scratch every time. Also, one should be aware of the corresponding function:
sage: L = LinearCode(M); L
Linear code of length 65, dimension 4 over Finite Field of size 5
sage: e = L.weight_enumerator(); e
x^65 + 364*x^15*y^50 + 260*x^10*y^55
allows one to get all the needed info without going through the code explicitly:
sage: e.substitute(x=1)
260*y^55 + 364*y^50 + 1
sage: e.substitute(x=1).coefficients()
[260, 364, 1]
sage: map(lambda x: x[1], e.substitute(x=1).exponents())
[55, 50, 0]
comment:38 in reply to: ↑ 36 ; followup: ↓ 42 Changed 4 years ago by
OK, ready now :)
Could you:
1) Define what the '1st eigenmatrix' is, or point toward somewhere which defines it?
2) This function has no reason to be a 'cdef' function, given that it calls highlevel functions like 'matrix'. You don't save anything with the 'cdef', and removing it will enable you to add doctests.
3) Why should we keep SRG_243_110_37_60 if you say that we don't use it, and that we obtain the same graph by another way?
4) Could you add in the code an indication/reference for this new filter:
if 1+f+g != v: # the only other eigenvalue, k, has multiplicity 1
All other filters should already have such a reference.
5) I'm a bit worried about this new filter: does it discard any new entry? I thought that the filters we had already matched Andries Brouwer's database.
6) Instead of [function, M.transpose()], could you use [lambda x: function(x.transpose()), M]? The difference is that the first computes the transpose when the entry is created, while the second only does it when the function is called.
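The eager-versus-deferred distinction in point 6 can be seen with a toy stand-in (plain Python; transpose and build_graph are hypothetical placeholders for the real Sage functions):

```python
calls = []

def transpose(m):
    # record each invocation so we can see *when* the work happens
    calls.append("transposed")
    return [list(row) for row in zip(*m)]

def build_graph(m):  # stand-in for strongly_regular_from_two_intersection_set
    return (len(m), len(m[0]))

M = [[1, 0, 1],
     [0, 1, 1]]

# Eager: the transpose is computed while the table entry is constructed,
# even if the constructor is never invoked afterwards.
entry_eager = [build_graph, transpose(M)]
assert calls == ["transposed"]

# Deferred: the lambda captures M untouched; the transpose only happens
# when the entry is actually called.
calls.clear()
entry_lazy = [lambda x: build_graph(transpose(x)), M]
assert calls == []                      # nothing computed yet
result = entry_lazy[0](entry_lazy[1])
assert calls == ["transposed"]          # computed on demand
print(result)                           # (3, 2)
```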
Thanks,
Nathann
P.S.: I rebased the branch atop the latest release, and merged your two commits
comment:39 Changed 4 years ago by
 Status changed from needs_review to needs_work
comment:40 followup: ↓ 41 Changed 4 years ago by
It seems that those two codes generate the very same two graphs
{'q': 2, 'n': 73, 'source': 'Shared by Eric Chen [ChenDB]_.', 'w2': 40, 'w1': 32, 'k': 9}
{'q': 2, 'n': 219, 'source': 'Shared by Eric Chen [ChenDB]_.', 'w2': 112, 'w1': 96, 'k': 9}
comment:41 in reply to: ↑ 40 Changed 4 years ago by
Replying to ncohen:
It seems that those two codes generate the very same two graphs
{'q': 2, 'n': 73, 'source': 'Shared by Eric Chen [ChenDB]_.', 'w2': 40, 'w1': 32, 'k': 9}
{'q': 2, 'n': 219, 'source': 'Shared by Eric Chen [ChenDB]_.', 'w2': 112, 'w1': 96, 'k': 9}
I suppose these are ones I mentioned in comment 29:
The 2-wt code of (512, 219, 106, 84) gives (512, 73, 10, 12), and correspondingly the 2-wt code of (512, 73, 12, 10) gives (512, 219, 106, 84) (that is, it suffices to use just one of these 2-wt codes to get both graphs).
comment:42 in reply to: ↑ 38 ; followup: ↓ 44 Changed 4 years ago by
Replying to ncohen:
OK, ready now :)
Could you:
3) Why should we keep SRG_243_110_37_60 if you say that we don't use it, and that we obtain the same graph by another way?

well, it's a seemingly different construction; should we move it into smallgraphs?
4) Could you add in the code an indication/reference for this new filter:
if 1+f+g != v: # the only other eigenvalue, k, has multiplicity 1
All other filters should already have such a reference.
well, it is just common sense; the adjacency matrix has 3 different eigenvalues, one of which is k, of multiplicity 1 (and this is a classical fact about eigenvalues of nonnegative matrices), and the other two are r and s, with multiplicities f and g. Naturally, one must have 1+f+g=v.
By the way, the reference for the 'Integrality condition' (integrality of f) is very vague -- I don't know whether it is meant to be [BCN89]_, and if yes, then where this is mentioned. And why is only f checked, but not g?
5) I'm a bit worried about this new filter: does it discard any new entry? I thought that the filters we had already matched Andries Brouwer's database.
perhaps it does not, but it is certainly correct, and takes care of integrality of g.
P.S.: I rebased the branch atop of the latest release, and merged your two commits
do you want me to keep working on the branch currently on the ticket?
comment:43 Changed 4 years ago by
 Commit changed from 205bdd5b2639d6e0f6d55d2a85c8ee23087ab718 to 5b8a3fd5cb08cdfff21ca3dce73d8e5091af6b24
comment:44 in reply to: ↑ 42 Changed 4 years ago by
well, it's a seemingly different construction; should we move it into smallgraphs?
Argggggg, no please not in smallgraphs.. Let's keep it where the others are.
I guess it's fine to keep it if you want: I keep forgetting that you have this idea in your head of having "several instances per set of parameters", while I'm just trying to 'save' Andries Brouwer's database.
well, it is just common sense; the adjacency matrix has 3 different eigenvalues, one of which is k, of multiplicity 1 (and this is a classical fact about eigenvalues of nonnegative matrices), and the other two are r and s, with multiplicities f and g. Naturally, one must have 1+f+g=v.
Could you add a '# multiplicities of eigenvalues' somewhere ? :P
Only because I am an illiterate idiot :P
By the way, the reference for the 'Integrality condition' (integrality of f) is very vague -- I don't know whether it is meant to be [BCN89]_, and if yes, then where this is mentioned. And why is only f checked, but not g?
If that bothers you, then surely you understand why I prefer to have clear references for those filters :P
perhaps it does not, but it is certainly correct, and takes care of integrality of g.
Okay okay. For the sake of symmetry.
do you want me to keep working on the branch currently on the ticket?
I am a dangerous and forgetful idiot, and I forgot to push the branch. It is done; sorry for the delay, and if you have anything tricky to rebase because of it, please give me the branch's name and I will do it for you.
Nathann
comment:45 Changed 4 years ago by
 Commit changed from 5b8a3fd5cb08cdfff21ca3dce73d8e5091af6b24 to 9b0c141e745a8155e6d2b030116167e3f39a5ab0
Branch pushed to git repo; I updated commit sha1. New commits:
9b0c141  from cdef to def with docs added, etc

comment:46 Changed 4 years ago by
 Reviewers set to Nathann Cohen, Dima Pasechnik
 Status changed from needs_work to needs_review
comment:47 Changed 4 years ago by
 Commit changed from 9b0c141e745a8155e6d2b030116167e3f39a5ab0 to ab7dce9d05ea183f546a251f071f3ff933af855a
Branch pushed to git repo; I updated commit sha1. New commits:
14bdcab  Trac 19778: McFarland 1973 difference set construction

8bd17be  Trac 19778: input blocks

b3dd88a  Trac 19778: code reorganization in difference_family

f01133f  trac #19778: Merged with 7.0.beta1

67bcb95  trac #19777: Merged with 7.0.beta1

ab7dce9  trac #19777: Review

comment:48 Changed 4 years ago by
 Commit changed from ab7dce9d05ea183f546a251f071f3ff933af855a to 7194f38c78dd4e1c463433175427a1e71929c4bb
comment:49 Changed 4 years ago by
Seems that I am growing more incompetent with git every day. Sorry for the mess. I cleaned it and rebased the branch again :/
Nathann
comment:50 followup: ↓ 52 Changed 4 years ago by
Some remarks about your commit:
1) Thank you for the added info in the docstring of 'eigenmatrix'. I won't lie and say that I get all of it, but I know that the fault lies in my restricted knowledge and not in the doc. I really should hit those books :/
2) replacement k+r*s==mu -- is that true in general? Is it checked anywhere in the code, and should we check it?
3) I am not sure that I understand how exactly this second eigenmatrix works. If em==cinv*emi*cinv, are you sure that there exists a strongly regular graph whose eigenmatrix is vP^(-1)? Or is it only because we are obtaining this SRG from a 2-intersection set? In general, when are you sure that the other strongly regular graph exists from only looking at the value of vP^(-1)? I wonder how general this thing is, and if it is specific to two-weight codes or if it belongs somewhere else.
Thanks,
Nathann
comment:51 followup: ↓ 53 Changed 4 years ago by
4) What would you think of leaving the SRG_ function in the index, even if that construction is overwritten by the 2-weight code one? I don't think that it hurts, and at least from the point of view of the code this list is still an index of all SRG_* functions.
comment:52 in reply to: ↑ 50 Changed 4 years ago by
Replying to ncohen:
2) replacement k+r*s==mu -- is that true in general? Is it checked anywhere in the code, and should we check it?
Thm 9.1.3(ii) in [BH12]_ tells you how to get mu and l from r and s. In particular, indeed k+r*s==mu. It is basically a consequence of the fact that r and s are roots of a quadratic equation, with coefficients being linear functions of l and mu (and the coefficients are symmetric -- see, swapping s and r doesn't change things here -- functions of the roots, as every algebraist since Galois knows...)
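Concretely: r and s are the roots of x^2 - (l - mu)x - (k - mu) = 0, so their product is r*s = -(k - mu) and k + r*s == mu is an identity. A quick numerical check on parameter sets from this ticket (plain Python sketch):

```python
def eigenvalues_rs(k, l, mu):
    """Roots r >= s of x^2 - (l - mu)*x - (k - mu) = 0."""
    disc = ((l - mu) ** 2 + 4 * (k - mu)) ** 0.5
    return (l - mu + disc) / 2, (l - mu - disc) / 2

for (v, k, l, mu) in [(1024, 363, 122, 132), (512, 73, 12, 10), (243, 110, 37, 60)]:
    r, s = eigenvalues_rs(k, l, mu)
    # product of the roots is -(k - mu), hence k + r*s == mu
    assert abs(k + r * s - mu) < 1e-9, (v, k, l, mu)
    print((v, k, l, mu), "->", (r, s))
```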
3) I am not sure that I understand how exactly this second eigenmatrix works. If em==cinv*emi*cinv, are you sure that there exists a strongly regular graph whose eigenmatrix is vP^(-1)?

in this case vP^(-1) corresponds to the complement of the original graph. As the original graph exists, its complement exists, too...
Or is it only because we are obtaining this SRG from a 2-intersection set? In general, when are you sure that the other strongly regular graph exists from only looking at the value of vP^(-1)?

It does exist if the graph has a regular group of automorphisms (in particular, this is the case for graphs from 2-wt codes/projective 2-intersection sets). Then the dual graph comes from taking the elements of this group as vertices, and some nontrivial way to define adjacency among them...
I wonder how general this thing is, and if it is specific to two-weight codes or if it belongs somewhere else.
well, yes, it is more general, but not by too much...
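For concreteness, the vP^(-1) construction can be sketched over the rationals in plain Python (the names eigenmatrix_P and second_eigenmatrix are illustrative, not the Sage API; the first row of Q recovers the multiplicities 1, f, g):

```python
from fractions import Fraction

def eigenmatrix_P(v, k, l, mu):
    """First eigenmatrix of the 2-class association scheme of a (v,k,l,mu)-SRG."""
    disc = (l - mu) ** 2 + 4 * (k - mu)
    sq = round(disc ** 0.5)
    assert sq * sq == disc
    r = (l - mu + sq) // 2
    s = (l - mu - sq) // 2
    return [[1, k, v - k - 1],
            [1, r, -r - 1],
            [1, s, -s - 1]]

def second_eigenmatrix(v, P):
    """Q = v * P^(-1), computed exactly via the 3x3 adjugate over Q."""
    a, b, c = P
    det = (a[0]*(b[1]*c[2]-b[2]*c[1]) - a[1]*(b[0]*c[2]-b[2]*c[0])
           + a[2]*(b[0]*c[1]-b[1]*c[0]))
    adj = [[ (b[1]*c[2]-b[2]*c[1]), -(a[1]*c[2]-a[2]*c[1]),  (a[1]*b[2]-a[2]*b[1])],
           [-(b[0]*c[2]-b[2]*c[0]),  (a[0]*c[2]-a[2]*c[0]), -(a[0]*b[2]-a[2]*b[0])],
           [ (b[0]*c[1]-b[1]*c[0]), -(a[0]*c[1]-a[1]*c[0]),  (a[0]*b[1]-a[1]*b[0])]]
    return [[Fraction(v) * adj[i][j] / det for j in range(3)] for i in range(3)]

# Petersen graph: Q's first row lists the multiplicities 1, f, g = 1, 5, 4.
Q = second_eigenmatrix(10, eigenmatrix_P(10, 3, 0, 1))
print(Q[0])
```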
comment:53 in reply to: ↑ 51 Changed 4 years ago by
Replying to ncohen:
4) What would you think of leaving the SRG_ function in the index, even if that construction is overwritten by the 2-weight code one? I don't think that it hurts, and at least from the point of view of the code this list is still an index of all SRG_* functions.
as you like...
comment:54 Changed 4 years ago by
 Commit changed from 7194f38c78dd4e1c463433175427a1e71929c4bb to 29f123c10658cce8b05306738fcf720d5d869828
Branch pushed to git repo; I updated commit sha1. New commits:
29f123c  trac #19777: Keep the useless SRG_ function in the index

comment:55 Changed 4 years ago by
Okayyyyyyyy... Good to go? :)
Nathann
comment:57 Changed 4 years ago by
 Branch changed from public/19777 to 29f123c10658cce8b05306738fcf720d5d869828
 Resolution set to fixed
 Status changed from positive_review to closed
Branch pushed to git repo; I updated commit sha1. New commits:
trac #19777: Three new SRGs and db update