- Added a scanner program to detect possibly unused code.
  (Other programs that can do similar stuff seem to hiccup on the nested functions.)
This commit is contained in:
parent
eaa6ef1345
commit
68047b596a
@@ -1,7 +1,5 @@
-- De-classification does not work as desired. The name Alice is used
-  even if it already occurs somewhere in the system, which is not what
-  we want.
-- Heuristic could also punish more initiators.
+- (!!) Are the Arachne rules for keys that are variables sane? E.g. what
+  is their inverse key? Check!
 - If no attack/state output is needed, maybe the attack heuristic should
   be simpler (which means just weighting the trace length etc.) in order
   to avoid unnecessary continuation of the search. Maybe even stop
@@ -13,10 +11,9 @@
 - The trace incremental search should start at something like 'distance of
   first claim + SOMECONST * runs'. SOMECONST relates to the partial-order
   induced trace prolongations.
-- For now, CLP is non-working and not checked. Reconsider this later.
 - Because the properties checked are related to the partial order reductions,
   it makes sense to check 'all secrecy claims' or 'all synchronisation claims'
-  at once.
+  at once. This holds in a similar way for Arachne.
 - Given originator assumptions and such, we might want to change the
   compiler so that, for non-typed matching, it will ignore all type definitions
   in the code (as in, putting them in stype, but still considering Function).
@@ -25,29 +22,7 @@
 - In a paper concerning the CASRUL compiler, they mention associativity for
   the equality relations. Note that CASRUL restricts the intruder actions,
   and sometimes uses approximations.
-- MS and CE never mention associativity as a restriction.
-- For now, clp stuff will use pairing and simple mgu. Maybe a -m3 switch will
-  add the associative variant later.
-- MALLOC_TRACE is the environment variable that triggers mtrace() output.
 - Several memory leak detectors are still in place; they can all be located by
   searching for "leak" or "diff" and removing that code.
 - knowledgeAddTerm might be improved by scanning through the key list only with
   things that are newly added.
-- Associativity and unification don't play together well; consider other
-  algorithms. Maybe a simple explore (x,y),z and explore x,(y,z) will do, e.g.
-  take any split point for both sequences.
-- POR (-t3) is actually worse than -t2: sends first. After some pondering I
-  now know why. It's because simultaneous reads yield traces in which there
-  are attacks that are also in the ones where we have R0S0S0;R1 (i.e. it is
-  thus equivalent with R0;R1;S0S0).
-- Knowledge is monotonic. So, if we say 'R#0' cannot be triggered by some
-  knowledge, we can just use a pointer for it. If we do want to trigger it, we
-  need to compare two knowledge pointers. As they don't have a unique
-  representation, this is more difficult. For now, I will just forbid specific
-  branches of the read when a message is in some previous knowledge, which is
-  more precise.
-- Comparisons with Casper are in order.
-- It is a good idea to have any number of input files, and parse them all, to
-  combine scenarios. A command should be "require <file>", because that would
-  enable scenarios to load the required role definitions. It's require instead
-  of include, because of possible multiple occurrences.
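The TODO item above about monotonic knowledge can be illustrated with a toy model. Plain Python sets stand in for Scyther's knowledge structures (which, as the note says, lack a unique representation), and every name below is invented for this sketch:

```python
# Toy illustration (not Scyther's data structures; all names invented):
# intruder knowledge only grows along a trace, so anything derivable from
# an earlier knowledge set is derivable from every later one. A read that
# could already be triggered by previous knowledge can therefore be pruned.

# Knowledge after each event of a trace, as ever-growing sets.
trace_knowledge = [
    {"na"},
    {"na", "nb"},
    {"na", "nb", "k(a,b)"},
]

def knows(step, message):
    """True if the intruder knows 'message' after 'step' events."""
    return message in trace_knowledge[step]

def prune_read(step, message):
    """Forbid this read branch if the message was already known earlier."""
    return any(knows(i, message) for i in range(step))

# Monotonicity check: each knowledge set contains all earlier ones.
assert all(trace_knowledge[i] <= trace_knowledge[i + 1]
           for i in range(len(trace_knowledge) - 1))
```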
src/scantags.py (new executable file, 82 lines)
@@ -0,0 +1,82 @@
+#!/usr/bin/python
+
+import commands
+
+
+class Tag(object):
+    """
+    Object for a tag (ctags line)
+    """
+
+    def __init__(self, tagline):
+        # A ctags line is tab-separated: tagname, filename, ex command, ...
+        tl = tagline.strip().split('\t')
+        self.tag = tl[0]
+        self.filename = tl[1]
+
+    def __str__(self):
+        return self.tag
+
+
+class GrepRes(object):
+    """
+    Object for a result line from grep
+    """
+
+    def __init__(self, line):
+        self.line = line
+        x = line.find(":")
+        if x != -1:  # find() returns -1 (which is truthy) on a miss
+            self.filename = line[:x]
+            self.text = line[x + 1:].strip()
+        else:
+            self.filename = None
+            self.text = None
+
+    def __str__(self):
+        return self.line
+
+
+def outToRes(out, skipfiles=[]):
+    """
+    Filter grep output and make a list of GrepRes objects. Filter out
+    any results that come from the filenames in the 'skipfiles' list.
+    Also return the count of all results (not taking the filter into
+    account).
+    """
+    reslist = []
+    count = 0
+    for l in out.splitlines():
+        gr = GrepRes(l)
+        if gr.filename not in skipfiles:
+            reslist.append(gr)
+        count = count + 1
+    return (reslist, count)
+
+
+def gettags():
+    """
+    Get all the tags in a list
+    """
+    f = open("tags", "r")
+    tags = []
+    for l in f.readlines():
+        if not l.startswith("!"):  # skip ctags metadata lines
+            tags.append(Tag(l))
+    f.close()
+    return tags
+
+
+def tagoccurs(tag):
+    """
+    Check tag occurrences in .c and .h files and show interesting ones.
+    """
+    cmd = "grep \"\\<%s\\>\" *.[ch]" % tag
+    (reslist, count) = outToRes(commands.getoutput(cmd), [tag.filename])
+    if (len(reslist) == 0) and (count < 2):
+        print "\"%s\" seems to occur only %i times in %s" % (tag, count, tag.filename)
+
+
+def main():
+    tags = gettags()
+    for t in tags:
+        tagoccurs(t)
+
+
+main()
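One subtlety in parsing grep output: `str.find` returns -1 (not None) on a miss, and -1 is truthy, so a bare `if x:` guard silently misparses lines without a colon and rejects a colon at position 0. A minimal Python 3 sketch of a safe split (the function name is invented for this sketch):

```python
def split_grep_line(line):
    """Split a 'filename:match' grep line; return (None, None) if malformed."""
    x = line.find(":")
    if x != -1:  # find() yields -1, not None, when no colon is present
        return line[:x], line[x + 1:].strip()
    return None, None

# Example: a grep hit and a malformed line.
hit = split_grep_line("mem.c:int leaked_blocks;")
miss = split_grep_line("no separator here")
```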
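The Tag class in scantags.py assumes the classic tab-separated ctags `tags` file format, where metadata lines start with `!`. The same parse in isolation, with an invented function name and a made-up example line:

```python
# The classic ctags 'tags' file is tab-separated: tagname, filename,
# ex command, then optional extension fields. This mirrors Tag.__init__
# in scantags.py; parse_ctag_line is an invented name for this sketch.
def parse_ctag_line(tagline):
    fields = tagline.strip().split("\t")
    return fields[0], fields[1]

tag, filename = parse_ctag_line(
    'knowledgeAddTerm\tknowledge.c\t/^int knowledgeAddTerm/;"\tf')
```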