Mirror of https://github.com/sqlmapproject/sqlmap.git, synced 2025-12-06 12:41:30 +00:00

Compare commits (67 commits)
| Author | SHA1 | Date |
|---|---|---|
| | 4b4f728d8e | |
| | e8336ecfe1 | |
| | 38ea0686a8 | |
| | 73b0de67b5 | |
| | fae97b3937 | |
| | c0947846f4 | |
| | 5e2d0bd320 | |
| | 4badb54607 | |
| | 29aaec8925 | |
| | 27ff5d6fec | |
| | 72ff6e24ff | |
| | 717c451b8c | |
| | e5968cae31 | |
| | 2b55ae3e2a | |
| | 8f4488d608 | |
| | f1254fef4b | |
| | ccda26a567 | |
| | 099110bc1f | |
| | 0265b3fcfa | |
| | 961d2b24d1 | |
| | 53578bcb7c | |
| | 756f02fb0e | |
| | 17c170e1f8 | |
| | 220c1be162 | |
| | 6b06332896 | |
| | c268663bd9 | |
| | a97fd1dede | |
| | b93284530e | |
| | cf4c263a4e | |
| | 23777143b6 | |
| | 9b397f00be | |
| | d47c16e196 | |
| | e0c7b5c63c | |
| | 091c8ab2dd | |
| | 86303bde55 | |
| | c89f119e1a | |
| | 25369ca591 | |
| | a399b65033 | |
| | ed37ae1562 | |
| | 5381d4d5be | |
| | c1825b2651 | |
| | e7d448c56c | |
| | 694b5bb5c0 | |
| | eb498e6c03 | |
| | ca8b589d43 | |
| | 18706f7fad | |
| | 80f3b9a711 | |
| | 6b3f01bfeb | |
| | 42042fb5de | |
| | 2abc7fc588 | |
| | 1ecc326714 | |
| | d2d829abf5 | |
| | 43d9ac2bd4 | |
| | d8196cf7e6 | |
| | 42b0edca6d | |
| | 331ccc5549 | |
| | d5627fdf1b | |
| | 7b3a17bfe7 | |
| | 4a8f01c9dc | |
| | 13bf3e649a | |
| | 9a63fb1055 | |
| | 3544793961 | |
| | 7a8add0412 | |
| | 1d382bcb4d | |
| | ec6ad3ce68 | |
| | 73d8952f2a | |
| | 2a810fb796 | |
```diff
@@ -64,5 +64,6 @@ Translations
 * [Japanese](https://github.com/sqlmapproject/sqlmap/blob/master/doc/translations/README-ja-JP.md)
 * [Polish](https://github.com/sqlmapproject/sqlmap/blob/master/doc/translations/README-pl-PL.md)
 * [Portuguese](https://github.com/sqlmapproject/sqlmap/blob/master/doc/translations/README-pt-BR.md)
+* [Russian](https://github.com/sqlmapproject/sqlmap/blob/master/doc/translations/README-ru-RUS.md)
 * [Spanish](https://github.com/sqlmapproject/sqlmap/blob/master/doc/translations/README-es-MX.md)
 * [Turkish](https://github.com/sqlmapproject/sqlmap/blob/master/doc/translations/README-tr-TR.md)
```
doc/translations/README-ru-RUS.md (new file, +50 lines):

```markdown
# sqlmap

[](https://api.travis-ci.org/sqlmapproject/sqlmap) [](https://www.python.org/) [](https://raw.githubusercontent.com/sqlmapproject/sqlmap/master/LICENSE) [](https://twitter.com/sqlmap)

sqlmap is an open source penetration testing tool that automates the process of detecting and exploiting SQL injection flaws and taking over database servers. It comes with a powerful detection engine, many handy features for the professional penetration tester, and a broad range of switches that simplify working with databases, from fetching data out of the database to accessing the underlying file system and executing commands on the operating system via an out-of-band connection.

Screenshots
----

![]()

You can visit the [collection of screenshots](https://github.com/sqlmapproject/sqlmap/wiki/Screenshots) demonstrating some of the features in the wiki.

Installation
----

You can download the latest tarball by clicking [here](https://github.com/sqlmapproject/sqlmap/tarball/master) or the latest zipball by clicking [here](https://github.com/sqlmapproject/sqlmap/zipball/master).

Preferably, you can download sqlmap by cloning the [Git](https://github.com/sqlmapproject/sqlmap) repository:

    git clone --depth 1 https://github.com/sqlmapproject/sqlmap.git sqlmap-dev

sqlmap works out of the box with [Python](http://www.python.org/download/) versions **2.6.x** and **2.7.x** on any platform.

Usage
----

To get a list of basic options and switches, use:

    python sqlmap.py -h

To get a list of all options and switches, use:

    python sqlmap.py -hh

You can find a sample run [here](https://asciinema.org/a/46601).
To get an overview of sqlmap's capabilities, a list of supported features, and a description of all options and switches, along with examples, you are advised to consult the [user's manual](https://github.com/sqlmapproject/sqlmap/wiki/Usage).

Links
----

* Homepage: http://sqlmap.org
* Download: [.tar.gz](https://github.com/sqlmapproject/sqlmap/tarball/master) or [.zip](https://github.com/sqlmapproject/sqlmap/zipball/master)
* Commits RSS feed: https://github.com/sqlmapproject/sqlmap/commits/master.atom
* Issue tracker: https://github.com/sqlmapproject/sqlmap/issues
* User's manual: https://github.com/sqlmapproject/sqlmap/wiki
* Frequently Asked Questions (FAQ): https://github.com/sqlmapproject/sqlmap/wiki/FAQ
* Twitter: [@sqlmap](https://twitter.com/sqlmap)
* Demos: [http://www.youtube.com/user/inquisb/videos](http://www.youtube.com/user/inquisb/videos)
* Screenshots: https://github.com/sqlmapproject/sqlmap/wiki/Screenshots
```
```diff
@@ -80,7 +80,7 @@ def main(src, dst):
         cmd = ''

         # Wait for incoming replies
-        if sock in select.select([ sock ], [], [])[0]:
+        if sock in select.select([sock], [], [])[0]:
             buff = sock.recv(4096)

             if 0 == len(buff):
```
```diff
@@ -43,7 +43,7 @@ def updateMSSQLXML():

        return

-   releases = re.findall("class=\"BCC_DV_01DarkBlueTitle\">SQL Server\s(.+?)\sBuilds", mssqlVersionsHtmlString, re.I)
+   releases = re.findall(r"class=\"BCC_DV_01DarkBlueTitle\">SQL Server\s(.+?)\sBuilds", mssqlVersionsHtmlString, re.I)
    releasesCount = len(releases)

    # Create the minidom document
```

```diff
@@ -74,7 +74,7 @@ def updateMSSQLXML():
            stopIdx = mssqlVersionsHtmlString.index("SQL Server %s Builds" % releases[index + 1])

        mssqlVersionsReleaseString = mssqlVersionsHtmlString[startIdx:stopIdx]
-       servicepackVersion = re.findall("</td><td>(7\.0|2000|2005|2008|2008 R2)*(.*?)</td><td.*?([\d\.]+)</td>[\r]*\n", mssqlVersionsReleaseString, re.I)
+       servicepackVersion = re.findall(r"</td><td>(7\.0|2000|2005|2008|2008 R2)*(.*?)</td><td.*?([\d\.]+)</td>[\r]*\n", mssqlVersionsReleaseString, re.I)

        for servicePack, version in servicepackVersion:
            if servicePack.startswith(" "):
```
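Many hunks in this compare view make the same one-character change: adding the `r` (raw string) prefix to regex literals containing backslashes. A minimal standalone sketch of why that matters (illustrative strings, not sqlmap code):

```python
import re

# Without the r prefix, Python resolves backslash escapes in the string
# literal before the regex engine ever sees it. "\s" happens to survive
# (it is not a recognized string escape) but raises a DeprecationWarning
# on modern Pythons, while an escape like "\b" silently becomes a
# backspace character and changes the pattern's meaning.
pattern = r"SQL Server\s(.+?)\sBuilds"  # raw string: backslashes reach re intact
match = re.search(pattern, 'class="BCC_DV_01DarkBlueTitle">SQL Server 2008 Builds')
print(match.group(1))  # 2008
```

The `re.findall` fixes above are exactly this change: the pattern text stays identical, only the literal becomes raw.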
extra/shutils/newlines.py (new file, +30 lines):

```python
#! /usr/bin/env python

# Runs pylint on all python scripts found in a directory tree
# Reference: http://rowinggolfer.blogspot.com/2009/08/pylint-recursively.html

import os
import sys

def check(filepath):
    if filepath.endswith(".py"):
        content = open(filepath, "rb").read()

        if "\n\n\n" in content:
            index = content.find("\n\n\n")
            print filepath, repr(content[index - 30:index + 30])

if __name__ == "__main__":
    try:
        BASE_DIRECTORY = sys.argv[1]
    except IndexError:
        print "no directory specified, defaulting to current working directory"
        BASE_DIRECTORY = os.getcwd()

    print "looking for *.py scripts in subdirectories of ", BASE_DIRECTORY
    for root, dirs, files in os.walk(BASE_DIRECTORY):
        if any(_ in root for _ in ("extra", "thirdparty")):
            continue
        for name in files:
            filepath = os.path.join(root, name)
            check(filepath)
```
```diff
@@ -1,7 +0,0 @@
-#!/bin/bash
-
-# Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/)
-# See the file 'LICENSE' for copying permission
-
-# Runs pep8 on all python files (prerequisite: apt-get install pep8)
-find . -wholename "./thirdparty" -prune -o -type f -iname "*.py" -exec pep8 '{}' \;
```
extra/shutils/pycodestyle.sh (new executable file, +7 lines):

```diff
@@ -0,0 +1,7 @@
+#!/bin/bash
+
+# Copyright (c) 2006-2018 sqlmap developers (http://sqlmap.org/)
+# See the file 'LICENSE' for copying permission
+
+# Runs pycodestyle on all python files (prerequisite: pip install pycodestyle)
+find . -wholename "./thirdparty" -prune -o -type f -iname "*.py" -exec pycodestyle --ignore=E501,E302,E305,E722,E402 '{}' \;
```
```diff
@@ -27,7 +27,7 @@ SMTP_SERVER = "127.0.0.1"
 SMTP_PORT = 25
 SMTP_TIMEOUT = 30
 FROM = "regressiontest@sqlmap.org"
-#TO = "dev@sqlmap.org"
+# TO = "dev@sqlmap.org"
 TO = ["bernardo.damele@gmail.com", "miroslav.stampar@gmail.com"]
 SUBJECT = "regression test started on %s using revision %s" % (START_TIME, getRevisionNumber())
 TARGET = "debian"
```

```diff
@@ -83,7 +83,7 @@ def main():
    if stderr:
        failure_email("Execution of regression test failed with error:\n\n%s" % stderr)

-   failed_tests = re.findall("running live test case: (.+?) \((\d+)\/\d+\)[\r]*\n.+test failed (at parsing items: (.+))?\s*\- scan folder: (\/.+) \- traceback: (.*?)( - SQL injection not detected)?[\r]*\n", stdout)
+   failed_tests = re.findall(r"running live test case: (.+?) \((\d+)\/\d+\)[\r]*\n.+test failed (at parsing items: (.+))?\s*\- scan folder: (\/.+) \- traceback: (.*?)( - SQL injection not detected)?[\r]*\n", stdout)

    for failed_test in failed_tests:
        title = failed_test[0]
```
```diff
@@ -433,7 +433,7 @@ def checkSqlInjection(place, parameter, value):

                if conf.invalidLogical:
                    _ = int(kb.data.randomInt[:2])
-                   origValue = "%s AND %s=%s" % (value, _, _ + 1)
+                   origValue = "%s AND %s LIKE %s" % (value, _, _ + 1)
                elif conf.invalidBignum:
                    origValue = kb.data.randomInt[:6]
                elif conf.invalidString:
```

```diff
@@ -558,14 +558,14 @@ def checkSqlInjection(place, parameter, value):
             logger.info(infoMsg)
         else:
             trueSet = set(extractTextTagContent(trueRawResponse))
-            trueSet = trueSet.union(__ for _ in trueSet for __ in _.split())
+            trueSet |= set(__ for _ in trueSet for __ in _.split())

             falseSet = set(extractTextTagContent(falseRawResponse))
-            falseSet = falseSet.union(__ for _ in falseSet for __ in _.split())
+            falseSet |= set(__ for _ in falseSet for __ in _.split())

             if threadData.lastErrorPage and threadData.lastErrorPage[1]:
                 errorSet = set(extractTextTagContent(threadData.lastErrorPage[1]))
-                errorSet = errorSet.union(__ for _ in errorSet for __ in _.split())
+                errorSet |= set(__ for _ in errorSet for __ in _.split())
             else:
                 errorSet = set()
```

```diff
@@ -783,7 +783,7 @@ def checkSqlInjection(place, parameter, value):

        if conf.multipleTargets:
            msg = "how do you want to proceed? [ne(X)t target/(s)kip current test/(e)nd detection phase/(n)ext parameter/(c)hange verbosity/(q)uit]"
-           choice = readInput(msg, default='T', checkBatch=False).upper()
+           choice = readInput(msg, default='X', checkBatch=False).upper()
        else:
            msg = "how do you want to proceed? [(S)kip current test/(e)nd detection phase/(n)ext parameter/(c)hange verbosity/(q)uit]"
            choice = readInput(msg, default='S', checkBatch=False).upper()
```
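The three paired changes in the `@@ -558` hunk replace `s = s.union(...)` with the in-place `s |= set(...)`. A standalone sketch of the underlying pattern — seeding a set with whole text fragments and then also indexing their individual words (illustrative data, not the output of sqlmap's `extractTextTagContent`):

```python
# Mirror of the trueSet/falseSet/errorSet handling: the right-hand set is
# fully built before |= applies, so iterating over `words` here is safe.
fragments = {"hello world", "foo"}
words = set(fragments)
words |= set(token for fragment in words for token in fragment.split())
print(sorted(words))  # ['foo', 'hello', 'hello world', 'world']
```

Both spellings compute the same union; `|=` updates the existing set in place, while `.union()` builds a new set that then has to be reassigned.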
```diff
@@ -1562,7 +1562,6 @@ def checkConnection(suppressOutput=False):
     else:
         kb.errorIsNone = True

-
     threadData = getCurrentThreadData()

     if kb.redirectChoice == REDIRECTION.YES and threadData.lastRedirectURL and threadData.lastRedirectURL[0] == threadData.lastRequestUID:
```
```diff
@@ -142,7 +142,7 @@ class Agent(object):
                match = re.search(r"\A[^ ]+", newValue)
                newValue = newValue[len(match.group() if match else ""):]
                _ = randomInt(2)
-               value = "%s%s AND %s=%s" % (origValue, match.group() if match else "", _, _ + 1)
+               value = "%s%s AND %s LIKE %s" % (origValue, match.group() if match else "", _, _ + 1)
            elif conf.invalidBignum:
                value = randomInt(6)
            elif conf.invalidString:
```

```diff
@@ -198,7 +198,7 @@ class Agent(object):
                regex = r"(\A|\b)%s=%s%s" % (re.escape(parameter), re.escape(origValue), r"(\Z|\b)" if origValue[-1].isalnum() else "")
                retVal = _(regex, "%s=%s" % (parameter, self.addPayloadDelimiters(newValue)), paramString)
            else:
-               retVal = _(r"(\A|\b)%s=%s(\Z|%s|%s|\s)" % (re.escape(parameter), re.escape(origValue), DEFAULT_GET_POST_DELIMITER, DEFAULT_COOKIE_DELIMITER), "%s=%s\g<2>" % (parameter, self.addPayloadDelimiters(newValue)), paramString)
+               retVal = _(r"(\A|\b)%s=%s(\Z|%s|%s|\s)" % (re.escape(parameter), re.escape(origValue), DEFAULT_GET_POST_DELIMITER, DEFAULT_COOKIE_DELIMITER), r"%s=%s\g<2>" % (parameter, self.addPayloadDelimiters(newValue)), paramString)

            if retVal == paramString and urlencode(parameter) != parameter:
                retVal = _(r"(\A|\b)%s=%s" % (re.escape(urlencode(parameter)), re.escape(origValue)), "%s=%s" % (urlencode(parameter), self.addPayloadDelimiters(newValue)), paramString)
```

```diff
@@ -535,7 +535,7 @@ class Agent(object):
        fieldsToCastStr = fieldsToCastStr or ""

        # Function
-       if re.search("\A\w+\(.*\)", fieldsToCastStr, re.I) or (fieldsSelectCase and "WHEN use" not in query) or fieldsSubstr:
+       if re.search(r"\A\w+\(.*\)", fieldsToCastStr, re.I) or (fieldsSelectCase and "WHEN use" not in query) or fieldsSubstr:
            fieldsToCastList = [fieldsToCastStr]
        else:
            fieldsToCastList = splitFields(fieldsToCastStr)
```

```diff
@@ -627,7 +627,7 @@ class Agent(object):
            concatenatedQuery = concatenatedQuery.replace("SELECT ", "'%s'||" % kb.chars.start, 1)
            _ = unArrayizeValue(zeroDepthSearch(concatenatedQuery, " FROM "))
            concatenatedQuery = "%s||'%s'%s" % (concatenatedQuery[:_], kb.chars.stop, concatenatedQuery[_:])
-           concatenatedQuery = re.sub(r"('%s'\|\|)(.+)(%s)" % (kb.chars.start, re.escape(castedFields)), "\g<2>\g<1>\g<3>", concatenatedQuery)
+           concatenatedQuery = re.sub(r"('%s'\|\|)(.+)(%s)" % (kb.chars.start, re.escape(castedFields)), r"\g<2>\g<1>\g<3>", concatenatedQuery)
        elif fieldsSelect:
            concatenatedQuery = concatenatedQuery.replace("SELECT ", "'%s'||" % kb.chars.start, 1)
            concatenatedQuery += "||'%s'" % kb.chars.stop
```

```diff
@@ -639,7 +639,7 @@ class Agent(object):
            concatenatedQuery = concatenatedQuery.replace("SELECT ", "'%s'+" % kb.chars.start, 1)
            concatenatedQuery += "+'%s'" % kb.chars.stop
        elif fieldsSelectTop:
-           topNum = re.search("\ASELECT\s+TOP\s+([\d]+)\s+", concatenatedQuery, re.I).group(1)
+           topNum = re.search(r"\ASELECT\s+TOP\s+([\d]+)\s+", concatenatedQuery, re.I).group(1)
            concatenatedQuery = concatenatedQuery.replace("SELECT TOP %s " % topNum, "TOP %s '%s'+" % (topNum, kb.chars.start), 1)
            concatenatedQuery = concatenatedQuery.replace(" FROM ", "+'%s' FROM " % kb.chars.stop, 1)
        elif fieldsSelectCase:
```
```diff
@@ -5,6 +5,7 @@ Copyright (c) 2006-2018 sqlmap developers (http://sqlmap.org/)
 See the file 'LICENSE' for copying permission
 """

+import binascii
 import codecs
 import contextlib
 import cookielib
```
```diff
@@ -101,7 +102,10 @@ from lib.core.settings import BOUNDED_INJECTION_MARKER
 from lib.core.settings import BRUTE_DOC_ROOT_PREFIXES
 from lib.core.settings import BRUTE_DOC_ROOT_SUFFIXES
 from lib.core.settings import BRUTE_DOC_ROOT_TARGET_MARK
+from lib.core.settings import BURP_REQUEST_REGEX
+from lib.core.settings import BURP_XML_HISTORY_REGEX
 from lib.core.settings import DBMS_DIRECTORY_DICT
+from lib.core.settings import CRAWL_EXCLUDE_EXTENSIONS
 from lib.core.settings import CUSTOM_INJECTION_MARK_CHAR
 from lib.core.settings import DEFAULT_COOKIE_DELIMITER
 from lib.core.settings import DEFAULT_GET_POST_DELIMITER
```
```diff
@@ -139,6 +143,7 @@ from lib.core.settings import PARTIAL_VALUE_MARKER
 from lib.core.settings import PAYLOAD_DELIMITER
 from lib.core.settings import PLATFORM
 from lib.core.settings import PRINTABLE_CHAR_REGEX
+from lib.core.settings import PROBLEMATIC_CUSTOM_INJECTION_PATTERNS
 from lib.core.settings import PUSH_VALUE_EXCEPTION_RETRY_COUNT
 from lib.core.settings import PYVERSION
 from lib.core.settings import REFERER_ALIASES
```
```diff
@@ -161,6 +166,7 @@ from lib.core.settings import URLENCODE_CHAR_LIMIT
 from lib.core.settings import URLENCODE_FAILSAFE_CHARS
 from lib.core.settings import USER_AGENT_ALIASES
 from lib.core.settings import VERSION_STRING
+from lib.core.settings import WEBSCARAB_SPLITTER
 from lib.core.threads import getCurrentThreadData
 from lib.utils.sqlalchemy import _sqlalchemy
 from thirdparty.clientform.clientform import ParseResponse
```
```diff
@@ -1025,7 +1031,7 @@ def readInput(message, default=None, checkBatch=True, boolean=False):
            logger.debug(debugMsg)

    if retVal is None:
-       if checkBatch and conf.get("batch"):
+       if checkBatch and conf.get("batch") or conf.get("api"):
            if isListLike(default):
                options = ','.join(getUnicode(opt, UNICODE_ENCODING) for opt in default)
            elif default:
```
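The `readInput` change leans on Python's operator precedence: `and` binds tighter than `or`, so the new condition parses as `(checkBatch and conf.get("batch")) or conf.get("api")` — the API check fires even when `checkBatch` is false. A quick standalone check, with plain booleans standing in for the `conf` lookups:

```python
def auto_answer(check_batch, batch, api):
    # mirrors: if checkBatch and conf.get("batch") or conf.get("api")
    return check_batch and batch or api

print(auto_answer(True, True, False))   # True  - batch mode, check enabled
print(auto_answer(False, True, False))  # False - batch mode, but check disabled
print(auto_answer(False, False, True))  # True  - api wins regardless of check_batch
```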
```diff
@@ -1267,11 +1273,15 @@ def setPaths(rootPath):
    paths.SQLMAP_DUMP_PATH = os.path.join(paths.SQLMAP_OUTPUT_PATH, "%s", "dump")
    paths.SQLMAP_FILES_PATH = os.path.join(paths.SQLMAP_OUTPUT_PATH, "%s", "files")

+   # history files
+   paths.SQLMAP_HISTORY_PATH = getUnicode(os.path.join(_, "history"), encoding=sys.getfilesystemencoding() or UNICODE_ENCODING)
+   paths.API_SHELL_HISTORY = os.path.join(paths.SQLMAP_HISTORY_PATH, "api.hst")
+   paths.OS_SHELL_HISTORY = os.path.join(paths.SQLMAP_HISTORY_PATH, "os.hst")
+   paths.SQL_SHELL_HISTORY = os.path.join(paths.SQLMAP_HISTORY_PATH, "sql.hst")
+   paths.SQLMAP_SHELL_HISTORY = os.path.join(paths.SQLMAP_HISTORY_PATH, "sqlmap.hst")
+   paths.GITHUB_HISTORY = os.path.join(paths.SQLMAP_HISTORY_PATH, "github.hst")
+
    # sqlmap files
-   paths.OS_SHELL_HISTORY = os.path.join(_, "os.hst")
-   paths.SQL_SHELL_HISTORY = os.path.join(_, "sql.hst")
-   paths.SQLMAP_SHELL_HISTORY = os.path.join(_, "sqlmap.hst")
-   paths.GITHUB_HISTORY = os.path.join(_, "github.hst")
    paths.CHECKSUM_MD5 = os.path.join(paths.SQLMAP_TXT_PATH, "checksum.md5")
    paths.COMMON_COLUMNS = os.path.join(paths.SQLMAP_TXT_PATH, "common-columns.txt")
    paths.COMMON_TABLES = os.path.join(paths.SQLMAP_TXT_PATH, "common-tables.txt")
```
```diff
@@ -1317,7 +1327,7 @@ def parseTargetDirect():
    remote = False

    for dbms in SUPPORTED_DBMS:
-       details = re.search("^(?P<dbms>%s)://(?P<credentials>(?P<user>.+?)\:(?P<pass>.*)\@)?(?P<remote>(?P<hostname>[\w.-]+?)\:(?P<port>[\d]+)\/)?(?P<db>[\w\d\ \:\.\_\-\/\\\\]+?)$" % dbms, conf.direct, re.I)
+       details = re.search(r"^(?P<dbms>%s)://(?P<credentials>(?P<user>.+?)\:(?P<pass>.*)\@)?(?P<remote>(?P<hostname>[\w.-]+?)\:(?P<port>[\d]+)\/)?(?P<db>[\w\d\ \:\.\_\-\/\\]+?)$" % dbms, conf.direct, re.I)

        if details:
            conf.dbms = details.group("dbms")
```
```diff
@@ -1387,6 +1397,10 @@ def parseTargetDirect():
                __import__("psycopg2")
            elif dbmsName == DBMS.ORACLE:
                __import__("cx_Oracle")
+
+               # Reference: http://itsiti.com/ora-28009-connection-sys-sysdba-sysoper
+               if (conf.dbmsUser or "").upper() == "SYS":
+                   conf.direct = "%s?mode=SYSDBA" % conf.direct
            elif dbmsName == DBMS.SQLITE:
                __import__("sqlite3")
            elif dbmsName == DBMS.ACCESS:
```
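The Oracle hunk appends a driver mode flag to the direct-connection string whenever the configured user is SYS, which otherwise fails with ORA-28009. A small standalone sketch of the string handling; `adjust_direct` and the sample DSNs are illustrative, not sqlmap API:

```python
def adjust_direct(direct, dbms_user):
    # Connections as SYS must be made AS SYSDBA (or SYSOPER), hence the flag
    if (dbms_user or "").upper() == "SYS":
        direct = "%s?mode=SYSDBA" % direct
    return direct

print(adjust_direct("oracle://SYS:s3cret@127.0.0.1:1521/XE", "SYS"))
# oracle://SYS:s3cret@127.0.0.1:1521/XE?mode=SYSDBA
print(adjust_direct("oracle://scott:tiger@127.0.0.1:1521/XE", "scott"))
# oracle://scott:tiger@127.0.0.1:1521/XE
```

Note the `(dbms_user or "")` guard, which keeps the comparison safe when no user was supplied at all.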
```diff
@@ -1436,7 +1450,7 @@ def parseTargetUrl():
        errMsg += "in the hostname part"
        raise SqlmapGenericException(errMsg)

-   hostnamePort = urlSplit.netloc.split(":") if not re.search(r"\[.+\]", urlSplit.netloc) else filter(None, (re.search("\[.+\]", urlSplit.netloc).group(0), re.search(r"\](:(?P<port>\d+))?", urlSplit.netloc).group("port")))
+   hostnamePort = urlSplit.netloc.split(":") if not re.search(r"\[.+\]", urlSplit.netloc) else filter(None, (re.search(r"\[.+\]", urlSplit.netloc).group(0), re.search(r"\](:(?P<port>\d+))?", urlSplit.netloc).group("port")))

    conf.scheme = (urlSplit.scheme.strip().lower() or "http") if not conf.forceSSL else "https"
    conf.path = urlSplit.path.strip()
```
```diff
@@ -1862,8 +1876,7 @@ def getFilteredPageContent(page, onlyText=True, split=" "):
    # only if the page's charset has been successfully identified
    if isinstance(page, unicode):
        retVal = re.sub(r"(?si)<script.+?</script>|<!--.+?-->|<style.+?</style>%s" % (r"|<[^>]+>|\t|\n|\r" if onlyText else ""), split, page)
-       while retVal.find(2 * split) != -1:
-           retVal = retVal.replace(2 * split, split)
+       retVal = re.sub(r"%s{2,}" % split, split, retVal)
        retVal = htmlunescape(retVal.strip().strip(split))

    return retVal
```
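The hunk above replaces a find/replace loop with a single regex substitution. A standalone sketch of both approaches for a plain-space separator (note the regex form assumes `split` contains no regex metacharacters, which holds for the default `split=" "`):

```python
import re

def collapse_loop(text, split=" "):
    # old approach: repeatedly halve runs of the separator until none remain
    while text.find(2 * split) != -1:
        text = text.replace(2 * split, split)
    return text

def collapse_regex(text, split=" "):
    # new approach: one pass, any run of 2+ separators becomes a single one
    return re.sub(r"%s{2,}" % split, split, text)

sample = "one   two     three"
print(collapse_loop(sample))   # one two three
print(collapse_regex(sample))  # one two three
```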
```diff
@@ -2156,7 +2169,7 @@ def initCommonOutputs():
            if line not in kb.commonOutputs[key]:
                kb.commonOutputs[key].add(line)

-def getFileItems(filename, commentPrefix='#', unicode_=True, lowercase=False, unique=False):
+def getFileItems(filename, commentPrefix='#', unicoded=True, lowercase=False, unique=False):
    """
    Returns newline delimited items contained inside file
    """
```
```diff
@@ -2169,20 +2182,14 @@ def getFileItems(filename, commentPrefix='#', unicode_=True, lowercase=False, un
    checkFile(filename)

    try:
-       with openFile(filename, 'r', errors="ignore") if unicode_ else open(filename, 'r') as f:
-           for line in (f.readlines() if unicode_ else f.xreadlines()):  # xreadlines doesn't return unicode strings when codec.open() is used
+       with openFile(filename, 'r', errors="ignore") if unicoded else open(filename, 'r') as f:
+           for line in (f.readlines() if unicoded else f.xreadlines()):  # xreadlines doesn't return unicode strings when codec.open() is used
                if commentPrefix:
                    if line.find(commentPrefix) != -1:
                        line = line[:line.find(commentPrefix)]

                line = line.strip()

-               if not unicode_:
-                   try:
-                       line = str.encode(line)
-                   except UnicodeDecodeError:
-                       continue
-
                if line:
                    if lowercase:
                        line = line.lower()
```
```diff
@@ -3358,7 +3365,7 @@ def createGithubIssue(errMsg, excMsg):

    _ = re.sub(r"'[^']+'", "''", excMsg)
    _ = re.sub(r"\s+line \d+", "", _)
-   _ = re.sub(r'File ".+?/(\w+\.py)', "\g<1>", _)
+   _ = re.sub(r'File ".+?/(\w+\.py)', r"\g<1>", _)
    _ = re.sub(r".+\Z", "", _)
    key = hashlib.md5(_).hexdigest()[:8]

```
```diff
@@ -3369,7 +3376,7 @@ def createGithubIssue(errMsg, excMsg):
    msg += "with the unhandled exception information at "
    msg += "the official Github repository? [y/N] "
    try:
-       choice = readInput(msg, default='N', boolean=True)
+       choice = readInput(msg, default='N', checkBatch=False, boolean=True)
    except:
        choice = None

```
@@ -3436,10 +3443,10 @@ def maskSensitiveData(msg):

         value = extractRegexResult(regex, retVal)
         retVal = retVal.replace(value, '*' * len(value))

-    if not conf.get("hostname"):
-        match = re.search(r"(?i)sqlmap.+(-u|--url)(\s+|=)([^ ]+)", retVal)
+    # Just in case (for problematic parameters regarding user encoding)
+    match = re.search(r"(?i)[ -]-(u|url|data|cookie)( |=)(.*?)( -?-[a-z]|\Z)", retVal)
     if match:
         retVal = retVal.replace(match.group(3), '*' * len(match.group(3)))

     if getpass.getuser():
         retVal = re.sub(r"(?i)\b%s\b" % re.escape(getpass.getuser()), '*' * len(getpass.getuser()), retVal)
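The new regex above masks the value of `-u`/`--url`/`--data`/`--cookie` on a logged command line, stopping at the next option or end of string. A standalone sketch of the masking step on a made-up command line:

```python
import re

# group(3) captures everything between the option's separator and the
# next "-x"/"--xyz" option (or end of string); it is overwritten with '*'s.
cmdline = "sqlmap.py -u http://target/vuln.php?id=1 --batch"
match = re.search(r"(?i)[ -]-(u|url|data|cookie)( |=)(.*?)( -?-[a-z]|\Z)", cmdline)
if match:
    cmdline = cmdline.replace(match.group(3), '*' * len(match.group(3)))
print(cmdline)
```

The lazy `(.*?)` plus the trailing alternation is what lets the same pattern handle both `--data` payloads containing spaces and a URL followed by further switches.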
@@ -3525,6 +3532,7 @@ def removeReflectiveValues(content, payload, suppressWarning=False):

                 regex = r"%s\b" % regex

             _retVal = [retVal]
+
             def _thread(regex):
                 try:
                     _retVal[0] = re.sub(r"(?i)%s" % regex, REFLECTED_VALUE_MARKER, _retVal[0])
@@ -3960,6 +3968,7 @@ def findPageForms(content, url, raise_=False, addToTargets=False):

         def __init__(self, content, url):
             StringIO.__init__(self, unicodeencode(content, kb.pageEncoding) if isinstance(content, unicode) else content)
             self._url = url
+
         def geturl(self):
             return self._url

@@ -4085,7 +4094,7 @@ def getHostHeader(url):

     retVal = urlparse.urlparse(url).netloc

     if re.search(r"http(s)?://\[.+\]", url, re.I):
-        retVal = extractRegexResult("http(s)?://\[(?P<result>.+)\]", url)
+        retVal = extractRegexResult(r"http(s)?://\[(?P<result>.+)\]", url)
     elif any(retVal.endswith(':%d' % _) for _ in (80, 443)):
         retVal = retVal.split(':')[0]

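The bracketed pattern above exists because `urlparse` returns the whole `[addr]:port` netloc for IPv6 URLs. A sketch with a hypothetical re-creation of the `extractRegexResult` helper (return the named group `result` of the first match, or `None`):

```python
import re

def extract_regex_result(regex, content):
    # hypothetical stand-in for sqlmap's extractRegexResult helper
    match = re.search(regex, content)
    return match.group("result") if match else None

# For IPv6 URLs the host lives inside the brackets, hence the dedicated pattern.
host = extract_regex_result(r"http(s)?://\[(?P<result>.+)\]", "https://[2001:db8::1]:8080/page")
print(host)  # 2001:db8::1
```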
@@ -4097,6 +4106,7 @@ def checkDeprecatedOptions(args):

     """

     for _ in args:
+        _ = _.split('=')[0].strip()
         if _ in DEPRECATED_OPTIONS:
             errMsg = "switch/option '%s' is deprecated" % _
             if DEPRECATED_OPTIONS[_]:
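The added line normalizes `--option=value` arguments to the bare option name before the deprecation lookup, so `--second-order=...` is caught as well as `--second-order`. The pattern in isolation (with one entry from the table below as sample data):

```python
# Split on '=' first so "--opt=value" and "--opt value" both resolve to "--opt".
DEPRECATED_OPTIONS = {"--second-order": "use '--second-url' instead"}

def deprecation_hint(arg):
    name = arg.split('=')[0].strip()
    return DEPRECATED_OPTIONS.get(name)

print(deprecation_hint("--second-order=http://host/page"))  # use '--second-url' instead
```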
@@ -4282,7 +4292,7 @@ def hashDBWrite(key, value, serialize=False):

     Helper function for writing session data to HashDB
     """

-    _ = "%s%s%s" % (conf.url or "%s%s" % (conf.hostname, conf.port), key, HASHDB_MILESTONE_VALUE)
+    _ = '|'.join((str(_) if not isinstance(_, basestring) else _) for _ in (conf.hostname, conf.path.strip('/') if conf.path is not None else conf.port, key, HASHDB_MILESTONE_VALUE))
     conf.hashDB.write(_, value, serialize)

 def hashDBRetrieve(key, unserialize=False, checkConf=False):
@@ -4290,7 +4300,7 @@ def hashDBRetrieve(key, unserialize=False, checkConf=False):

     Helper function for restoring session data from HashDB
     """

-    _ = "%s%s%s" % (conf.url or "%s%s" % (conf.hostname, conf.port), key, HASHDB_MILESTONE_VALUE)
+    _ = '|'.join((str(_) if not isinstance(_, basestring) else _) for _ in (conf.hostname, conf.path.strip('/') if conf.path is not None else conf.port, key, HASHDB_MILESTONE_VALUE))
     retVal = conf.hashDB.retrieve(_, unserialize) if kb.resumeValues and not (checkConf and any((conf.flushSession, conf.freshQueries))) else None

     if not kb.inferenceMode and not kb.fileReadMode and isinstance(retVal, basestring) and any(_ in retVal for _ in (PARTIAL_VALUE_MARKER, PARTIAL_HEX_VALUE_MARKER)):
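Both HashDB hunks switch from plain string concatenation to a `'|'`-joined key, so adjacent components can no longer run together and collide, and the key is derived from hostname plus path (falling back to port) rather than the full URL. A Python 3 sketch of the key scheme; the milestone constant here is a placeholder, not sqlmap's actual `HASHDB_MILESTONE_VALUE`:

```python
MILESTONE = "dPHoJRQYvs"  # placeholder for HASHDB_MILESTONE_VALUE

def hashdb_key(hostname, path, port, key):
    # path (stripped of slashes) when available, otherwise the port,
    # joined with '|' so components stay unambiguous in the composite key
    parts = (hostname, path.strip('/') if path is not None else port, key, MILESTONE)
    return '|'.join(str(_) for _ in parts)

print(hashdb_key("target.tld", "/app/", 80, "KB_CHARS"))  # target.tld|app|KB_CHARS|dPHoJRQYvs
```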
@@ -4342,7 +4352,7 @@ def resetCookieJar(cookieJar):

         except cookielib.LoadError, msg:
             errMsg = "there was a problem loading "
-            errMsg += "cookies file ('%s')" % re.sub(r"(cookies) file '[^']+'", "\g<1>", str(msg))
+            errMsg += "cookies file ('%s')" % re.sub(r"(cookies) file '[^']+'", r"\g<1>", str(msg))
             raise SqlmapGenericException(errMsg)

 def decloakToTemp(filename):
@@ -4388,7 +4398,7 @@ def getRequestHeader(request, name):

     retVal = None

-    if request and name:
+    if request and request.headers and name:
         _ = name.upper()
         retVal = max(value if _ == key.upper() else None for key, value in request.header_items())

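The guard now also checks `request.headers` before scanning; the lookup itself compares header names case-insensitively. The original uses a Python 2 `max()` trick over mixed `None`/string values, which breaks on Python 3, so this sketch expresses the same idea with `next()` over a plain dict (an assumption; sqlmap's `request` is a urllib2 object):

```python
def get_request_header(headers, name):
    # case-insensitive lookup; returns None when missing or inputs are empty
    if not headers or not name:
        return None
    wanted = name.upper()
    return next((value for key, value in headers.items() if key.upper() == wanted), None)

headers = {"Content-Type": "text/html", "X-Powered-By": "PHP/7.2"}
print(get_request_header(headers, "content-type"))  # text/html
```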
@@ -4469,6 +4479,195 @@ def pollProcess(process, suppress_errors=False):

             break

+def parseRequestFile(reqFile, checkParams=True):
+    """
+    Parses WebScarab and Burp logs and adds results to the target URL list
+    """
+
+    def _parseWebScarabLog(content):
+        """
+        Parses WebScarab logs (POST method not supported)
+        """
+
+        reqResList = content.split(WEBSCARAB_SPLITTER)
+
+        for request in reqResList:
+            url = extractRegexResult(r"URL: (?P<result>.+?)\n", request, re.I)
+            method = extractRegexResult(r"METHOD: (?P<result>.+?)\n", request, re.I)
+            cookie = extractRegexResult(r"COOKIE: (?P<result>.+?)\n", request, re.I)
+
+            if not method or not url:
+                logger.debug("not a valid WebScarab log data")
+                continue
+
+            if method.upper() == HTTPMETHOD.POST:
+                warnMsg = "POST requests from WebScarab logs aren't supported "
+                warnMsg += "as their body content is stored in separate files. "
+                warnMsg += "Nevertheless you can use -r to load them individually."
+                logger.warning(warnMsg)
+                continue
+
+            if not(conf.scope and not re.search(conf.scope, url, re.I)):
+                yield (url, method, None, cookie, tuple())
+
+    def _parseBurpLog(content):
+        """
+        Parses Burp logs
+        """
+
+        if not re.search(BURP_REQUEST_REGEX, content, re.I | re.S):
+            if re.search(BURP_XML_HISTORY_REGEX, content, re.I | re.S):
+                reqResList = []
+                for match in re.finditer(BURP_XML_HISTORY_REGEX, content, re.I | re.S):
+                    port, request = match.groups()
+                    try:
+                        request = request.decode("base64")
+                    except binascii.Error:
+                        continue
+                    _ = re.search(r"%s:.+" % re.escape(HTTP_HEADER.HOST), request)
+                    if _:
+                        host = _.group(0).strip()
+                        if not re.search(r":\d+\Z", host):
+                            request = request.replace(host, "%s:%d" % (host, int(port)))
+                    reqResList.append(request)
+            else:
+                reqResList = [content]
+        else:
+            reqResList = re.finditer(BURP_REQUEST_REGEX, content, re.I | re.S)
+
+        for match in reqResList:
+            request = match if isinstance(match, basestring) else match.group(0)
+            request = re.sub(r"\A[^\w]+", "", request)
+
+            schemePort = re.search(r"(http[\w]*)\:\/\/.*?\:([\d]+).+?={10,}", request, re.I | re.S)
+
+            if schemePort:
+                scheme = schemePort.group(1)
+                port = schemePort.group(2)
+                request = re.sub(r"\n=+\Z", "", request.split(schemePort.group(0))[-1].lstrip())
+            else:
+                scheme, port = None, None
+
+            if not re.search(r"^[\n]*(%s).*?\sHTTP\/" % "|".join(getPublicTypeMembers(HTTPMETHOD, True)), request, re.I | re.M):
+                continue
+
+            if re.search(r"^[\n]*%s.*?\.(%s)\sHTTP\/" % (HTTPMETHOD.GET, "|".join(CRAWL_EXCLUDE_EXTENSIONS)), request, re.I | re.M):
+                continue
+
+            getPostReq = False
+            url = None
+            host = None
+            method = None
+            data = None
+            cookie = None
+            params = False
+            newline = None
+            lines = request.split('\n')
+            headers = []
+
+            for index in xrange(len(lines)):
+                line = lines[index]
+
+                if not line.strip() and index == len(lines) - 1:
+                    break
+
+                newline = "\r\n" if line.endswith('\r') else '\n'
+                line = line.strip('\r')
+                match = re.search(r"\A(%s) (.+) HTTP/[\d.]+\Z" % "|".join(getPublicTypeMembers(HTTPMETHOD, True)), line) if not method else None
+
+                if len(line.strip()) == 0 and method and method != HTTPMETHOD.GET and data is None:
+                    data = ""
+                    params = True
+
+                elif match:
+                    method = match.group(1)
+                    url = match.group(2)
+
+                    if any(_ in line for _ in ('?', '=', kb.customInjectionMark)):
+                        params = True
+
+                    getPostReq = True
+
+                # POST parameters
+                elif data is not None and params:
+                    data += "%s%s" % (line, newline)
+
+                # GET parameters
+                elif "?" in line and "=" in line and ": " not in line:
+                    params = True
+
+                # Headers
+                elif re.search(r"\A\S+:", line):
+                    key, value = line.split(":", 1)
+                    value = value.strip().replace("\r", "").replace("\n", "")
+
+                    # Cookie and Host headers
+                    if key.upper() == HTTP_HEADER.COOKIE.upper():
+                        cookie = value
+                    elif key.upper() == HTTP_HEADER.HOST.upper():
+                        if '://' in value:
+                            scheme, value = value.split('://')[:2]
+                        splitValue = value.split(":")
+                        host = splitValue[0]
+
+                        if len(splitValue) > 1:
+                            port = filterStringValue(splitValue[1], "[0-9]")
+
+                    # Avoid to add a static content length header to
+                    # headers and consider the following lines as
+                    # POSTed data
+                    if key.upper() == HTTP_HEADER.CONTENT_LENGTH.upper():
+                        params = True
+
+                    # Avoid proxy and connection type related headers
+                    elif key not in (HTTP_HEADER.PROXY_CONNECTION, HTTP_HEADER.CONNECTION):
+                        headers.append((getUnicode(key), getUnicode(value)))
+
+                    if kb.customInjectionMark in re.sub(PROBLEMATIC_CUSTOM_INJECTION_PATTERNS, "", value or ""):
+                        params = True
+
+            data = data.rstrip("\r\n") if data else data
+
+            if getPostReq and (params or cookie or not checkParams):
+                if not port and isinstance(scheme, basestring) and scheme.lower() == "https":
+                    port = "443"
+                elif not scheme and port == "443":
+                    scheme = "https"
+
+                if conf.forceSSL:
+                    scheme = "https"
+                    port = port or "443"
+
+                if not host:
+                    errMsg = "invalid format of a request file"
+                    raise SqlmapSyntaxException(errMsg)
+
+                if not url.startswith("http"):
+                    url = "%s://%s:%s%s" % (scheme or "http", host, port or "80", url)
+                    scheme = None
+                    port = None
+
+                if not(conf.scope and not re.search(conf.scope, url, re.I)):
+                    yield (url, conf.method or method, data, cookie, tuple(headers))
+
+    checkFile(reqFile)
+    try:
+        with openFile(reqFile, "rb") as f:
+            content = f.read()
+    except (IOError, OSError, MemoryError), ex:
+        errMsg = "something went wrong while trying "
+        errMsg += "to read the content of file '%s' ('%s')" % (reqFile, getSafeExString(ex))
+        raise SqlmapSystemException(errMsg)
+
+    if conf.scope:
+        logger.info("using regular expression '%s' for filtering targets" % conf.scope)
+
+    for target in _parseBurpLog(content):
+        yield target
+
+    for target in _parseWebScarabLog(content):
+        yield target
+
 def getSafeExString(ex, encoding=None):
     """
     Safe way how to get the proper exception represtation as a string
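The new `parseRequestFile` above is a generator yielding `(url, method, data, cookie, headers)` tuples instead of mutating `kb.targets` directly. A heavily trimmed Python 3 sketch of the raw-request parsing it performs for a single request (real sqlmap also handles Burp/WebScarab wrappers, scheme/port recovery, and scope filtering, which are omitted here):

```python
def parse_raw_request(raw):
    """Yield one (url, method, data, cookie, headers) tuple from a raw
    HTTP request; assumes a plain-text request carrying a Host header."""
    head, _, body = raw.partition("\n\n")          # headers vs. POST body
    lines = head.splitlines()
    method, path, _ = lines[0].split(" ", 2)       # request line
    headers, cookie, host = [], None, None
    for line in lines[1:]:
        key, _, value = line.partition(":")
        value = value.strip()
        if key.upper() == "COOKIE":                # special-cased like above
            cookie = value
        elif key.upper() == "HOST":
            host = value
        else:
            headers.append((key, value))
    url = path if path.startswith("http") else "http://%s%s" % (host, path)
    yield (url, method, body or None, cookie, tuple(headers))

raw = "GET /vuln.php?id=1 HTTP/1.1\nHost: target.tld\nCookie: sid=abc"
print(next(parse_raw_request(raw)))
```

Making the parser a generator is what lets the call sites in option.py (further down) deduplicate and filter targets without the function needing access to `kb`.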
@@ -17,7 +17,7 @@ def cachedmethod(f, cache={}):

     """

     def _(*args, **kwargs):
-        key = int(hashlib.md5("".join(str(_) for _ in (f, args, kwargs))).hexdigest()[:8], 16)
+        key = int(hashlib.md5("|".join(str(_) for _ in (f, args, kwargs))).hexdigest(), 16) & 0x7fffffffffffffff
         if key not in cache:
             cache[key] = f(*args, **kwargs)

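The `cachedmethod` change widens the memoization key from 8 hex digits (32 bits) to the full MD5 digest masked to a positive 63-bit integer, and joins the key components with `'|'` so they cannot collide by running together. A Python 3 sketch of just the key derivation (MD5 needs bytes on Python 3, hence the `.encode()`):

```python
import hashlib

def cache_key(f, args, kwargs):
    # '|'-join the components, hash, then mask to a positive 63-bit int
    raw = "|".join(str(_) for _ in (f, args, kwargs))
    return int(hashlib.md5(raw.encode()).hexdigest(), 16) & 0x7fffffffffffffff

key = cache_key("func", (1, 2), {"a": 3})
print(0 <= key <= 0x7fffffffffffffff)  # True
```

With only 32 bits of key, the birthday bound makes accidental cache collisions plausible after ~2^16 distinct calls; 63 bits pushes that far out of reach.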
@@ -38,4 +38,4 @@ def stackedmethod(f):

         return result

     return _
@@ -5,6 +5,7 @@ Copyright (c) 2006-2018 sqlmap developers (http://sqlmap.org/)

 See the file 'LICENSE' for copying permission
 """

+from lib.core.enums import CONTENT_TYPE
 from lib.core.enums import DBMS
 from lib.core.enums import OS
 from lib.core.enums import POST_HINT
@@ -279,6 +280,8 @@ DEPRECATED_OPTIONS = {

     "--binary": "use '--binary-fields' instead",
     "--auth-private": "use '--auth-file' instead",
     "--ignore-401": "use '--ignore-code' instead",
+    "--second-order": "use '--second-url' instead",
+    "--purge-output": "use '--purge' instead",
     "--check-payload": None,
     "--check-waf": None,
     "--pickled-options": "use '--api -c ...' instead",
@@ -293,3 +296,31 @@ DEFAULT_DOC_ROOTS = {

     OS.WINDOWS: ("C:/xampp/htdocs/", "C:/wamp/www/", "C:/Inetpub/wwwroot/"),
     OS.LINUX: ("/var/www/", "/var/www/html", "/usr/local/apache2/htdocs", "/var/www/nginx-default", "/srv/www") # Reference: https://wiki.apache.org/httpd/DistrosDefaultLayout
 }
+
+PART_RUN_CONTENT_TYPES = {
+    "checkDbms": CONTENT_TYPE.TECHNIQUES,
+    "getFingerprint": CONTENT_TYPE.DBMS_FINGERPRINT,
+    "getBanner": CONTENT_TYPE.BANNER,
+    "getCurrentUser": CONTENT_TYPE.CURRENT_USER,
+    "getCurrentDb": CONTENT_TYPE.CURRENT_DB,
+    "getHostname": CONTENT_TYPE.HOSTNAME,
+    "isDba": CONTENT_TYPE.IS_DBA,
+    "getUsers": CONTENT_TYPE.USERS,
+    "getPasswordHashes": CONTENT_TYPE.PASSWORDS,
+    "getPrivileges": CONTENT_TYPE.PRIVILEGES,
+    "getRoles": CONTENT_TYPE.ROLES,
+    "getDbs": CONTENT_TYPE.DBS,
+    "getTables": CONTENT_TYPE.TABLES,
+    "getColumns": CONTENT_TYPE.COLUMNS,
+    "getSchema": CONTENT_TYPE.SCHEMA,
+    "getCount": CONTENT_TYPE.COUNT,
+    "dumpTable": CONTENT_TYPE.DUMP_TABLE,
+    "search": CONTENT_TYPE.SEARCH,
+    "sqlQuery": CONTENT_TYPE.SQL_QUERY,
+    "tableExists": CONTENT_TYPE.COMMON_TABLES,
+    "columnExists": CONTENT_TYPE.COMMON_COLUMNS,
+    "readFile": CONTENT_TYPE.FILE_READ,
+    "writeFile": CONTENT_TYPE.FILE_WRITE,
+    "osCmd": CONTENT_TYPE.OS_CMD,
+    "regRead": CONTENT_TYPE.REG_READ
+}
@@ -340,34 +340,6 @@ class CONTENT_TYPE:

     OS_CMD = 24
     REG_READ = 25

-PART_RUN_CONTENT_TYPES = {
-    "checkDbms": CONTENT_TYPE.TECHNIQUES,
-    "getFingerprint": CONTENT_TYPE.DBMS_FINGERPRINT,
-    "getBanner": CONTENT_TYPE.BANNER,
-    "getCurrentUser": CONTENT_TYPE.CURRENT_USER,
-    "getCurrentDb": CONTENT_TYPE.CURRENT_DB,
-    "getHostname": CONTENT_TYPE.HOSTNAME,
-    "isDba": CONTENT_TYPE.IS_DBA,
-    "getUsers": CONTENT_TYPE.USERS,
-    "getPasswordHashes": CONTENT_TYPE.PASSWORDS,
-    "getPrivileges": CONTENT_TYPE.PRIVILEGES,
-    "getRoles": CONTENT_TYPE.ROLES,
-    "getDbs": CONTENT_TYPE.DBS,
-    "getTables": CONTENT_TYPE.TABLES,
-    "getColumns": CONTENT_TYPE.COLUMNS,
-    "getSchema": CONTENT_TYPE.SCHEMA,
-    "getCount": CONTENT_TYPE.COUNT,
-    "dumpTable": CONTENT_TYPE.DUMP_TABLE,
-    "search": CONTENT_TYPE.SEARCH,
-    "sqlQuery": CONTENT_TYPE.SQL_QUERY,
-    "tableExists": CONTENT_TYPE.COMMON_TABLES,
-    "columnExists": CONTENT_TYPE.COMMON_COLUMNS,
-    "readFile": CONTENT_TYPE.FILE_READ,
-    "writeFile": CONTENT_TYPE.FILE_WRITE,
-    "osCmd": CONTENT_TYPE.OS_CMD,
-    "regRead": CONTENT_TYPE.REG_READ
-}
-
 class CONTENT_STATUS:
     IN_PROGRESS = 0
     COMPLETE = 1
@@ -382,6 +354,7 @@ class AUTOCOMPLETE_TYPE:

     SQL = 0
     OS = 1
     SQLMAP = 2
+    API = 3

 class NOTE:
     FALSE_POSITIVE_OR_UNEXPLOITABLE = "false positive or unexploitable"
|
|||||||
See the file 'LICENSE' for copying permission
|
See the file 'LICENSE' for copying permission
|
||||||
"""
|
"""
|
||||||
|
|
||||||
import binascii
|
|
||||||
import cookielib
|
import cookielib
|
||||||
import glob
|
import glob
|
||||||
import inspect
|
import inspect
|
||||||
import logging
|
import logging
|
||||||
import httplib
|
|
||||||
import os
|
import os
|
||||||
import random
|
import random
|
||||||
import re
|
import re
|
||||||
@@ -37,17 +35,15 @@ from lib.core.common import checkFile

 from lib.core.common import dataToStdout
 from lib.core.common import getPublicTypeMembers
 from lib.core.common import getSafeExString
-from lib.core.common import extractRegexResult
-from lib.core.common import filterStringValue
 from lib.core.common import findLocalPort
 from lib.core.common import findPageForms
 from lib.core.common import getConsoleWidth
 from lib.core.common import getFileItems
 from lib.core.common import getFileType
-from lib.core.common import getUnicode
 from lib.core.common import normalizePath
 from lib.core.common import ntToPosixSlashes
 from lib.core.common import openFile
+from lib.core.common import parseRequestFile
 from lib.core.common import parseTargetDirect
 from lib.core.common import parseTargetUrl
 from lib.core.common import paths
@@ -100,10 +96,7 @@ from lib.core.exception import SqlmapUnsupportedDBMSException

 from lib.core.exception import SqlmapUserQuitException
 from lib.core.log import FORMATTER
 from lib.core.optiondict import optDict
-from lib.core.settings import BURP_REQUEST_REGEX
-from lib.core.settings import BURP_XML_HISTORY_REGEX
 from lib.core.settings import CODECS_LIST_PAGE
-from lib.core.settings import CRAWL_EXCLUDE_EXTENSIONS
 from lib.core.settings import CUSTOM_INJECTION_MARK_CHAR
 from lib.core.settings import DBMS_ALIASES
 from lib.core.settings import DEFAULT_PAGE_ENCODING
@@ -120,7 +113,6 @@ from lib.core.settings import MAX_NUMBER_OF_THREADS

 from lib.core.settings import NULL
 from lib.core.settings import PARAMETER_SPLITTING_REGEX
 from lib.core.settings import PRECONNECT_CANDIDATE_TIMEOUT
-from lib.core.settings import PROBLEMATIC_CUSTOM_INJECTION_PATTERNS
 from lib.core.settings import SITE
 from lib.core.settings import SOCKET_PRE_CONNECT_QUEUE_SIZE
 from lib.core.settings import SQLMAP_ENVIRONMENT_PREFIX
@@ -132,7 +124,6 @@ from lib.core.settings import UNION_CHAR_REGEX

 from lib.core.settings import UNKNOWN_DBMS_VERSION
 from lib.core.settings import URI_INJECTABLE_REGEX
 from lib.core.settings import VERSION_STRING
-from lib.core.settings import WEBSCARAB_SPLITTER
 from lib.core.threads import getCurrentThreadData
 from lib.core.threads import setDaemon
 from lib.core.update import update
@@ -174,201 +165,6 @@ try:

 except NameError:
     WindowsError = None

-def _feedTargetsDict(reqFile, addedTargetUrls):
-    """
-    Parses web scarab and burp logs and adds results to the target URL list
-    """
-
-    def _parseWebScarabLog(content):
-        """
-        Parses web scarab logs (POST method not supported)
-        """
-
-        reqResList = content.split(WEBSCARAB_SPLITTER)
-
-        for request in reqResList:
-            url = extractRegexResult(r"URL: (?P<result>.+?)\n", request, re.I)
-            method = extractRegexResult(r"METHOD: (?P<result>.+?)\n", request, re.I)
-            cookie = extractRegexResult(r"COOKIE: (?P<result>.+?)\n", request, re.I)
-
-            if not method or not url:
-                logger.debug("not a valid WebScarab log data")
-                continue
-
-            if method.upper() == HTTPMETHOD.POST:
-                warnMsg = "POST requests from WebScarab logs aren't supported "
-                warnMsg += "as their body content is stored in separate files. "
-                warnMsg += "Nevertheless you can use -r to load them individually."
-                logger.warning(warnMsg)
-                continue
-
-            if not(conf.scope and not re.search(conf.scope, url, re.I)):
-                if not kb.targets or url not in addedTargetUrls:
-                    kb.targets.add((url, method, None, cookie, None))
-                    addedTargetUrls.add(url)
-
-    def _parseBurpLog(content):
-        """
-        Parses burp logs
-        """
-
-        if not re.search(BURP_REQUEST_REGEX, content, re.I | re.S):
-            if re.search(BURP_XML_HISTORY_REGEX, content, re.I | re.S):
-                reqResList = []
-                for match in re.finditer(BURP_XML_HISTORY_REGEX, content, re.I | re.S):
-                    port, request = match.groups()
-                    try:
-                        request = request.decode("base64")
-                    except binascii.Error:
-                        continue
-                    _ = re.search(r"%s:.+" % re.escape(HTTP_HEADER.HOST), request)
-                    if _:
-                        host = _.group(0).strip()
-                        if not re.search(r":\d+\Z", host):
-                            request = request.replace(host, "%s:%d" % (host, int(port)))
-                    reqResList.append(request)
-            else:
-                reqResList = [content]
-        else:
-            reqResList = re.finditer(BURP_REQUEST_REGEX, content, re.I | re.S)
-
-        for match in reqResList:
-            request = match if isinstance(match, basestring) else match.group(0)
-            request = re.sub(r"\A[^\w]+", "", request)
-
-            schemePort = re.search(r"(http[\w]*)\:\/\/.*?\:([\d]+).+?={10,}", request, re.I | re.S)
-
-            if schemePort:
-                scheme = schemePort.group(1)
-                port = schemePort.group(2)
-                request = re.sub(r"\n=+\Z", "", request.split(schemePort.group(0))[-1].lstrip())
-            else:
-                scheme, port = None, None
-
-            if not re.search(r"^[\n]*(%s).*?\sHTTP\/" % "|".join(getPublicTypeMembers(HTTPMETHOD, True)), request, re.I | re.M):
-                continue
-
-            if re.search(r"^[\n]*%s.*?\.(%s)\sHTTP\/" % (HTTPMETHOD.GET, "|".join(CRAWL_EXCLUDE_EXTENSIONS)), request, re.I | re.M):
-                continue
-
-            getPostReq = False
-            url = None
-            host = None
-            method = None
-            data = None
-            cookie = None
-            params = False
-            newline = None
-            lines = request.split('\n')
-            headers = []
-
-            for index in xrange(len(lines)):
-                line = lines[index]
-
-                if not line.strip() and index == len(lines) - 1:
-                    break
-
-                newline = "\r\n" if line.endswith('\r') else '\n'
-                line = line.strip('\r')
-                match = re.search(r"\A(%s) (.+) HTTP/[\d.]+\Z" % "|".join(getPublicTypeMembers(HTTPMETHOD, True)), line) if not method else None
-
-                if len(line.strip()) == 0 and method and method != HTTPMETHOD.GET and data is None:
-                    data = ""
-                    params = True
-
-                elif match:
-                    method = match.group(1)
-                    url = match.group(2)
-
-                    if any(_ in line for _ in ('?', '=', kb.customInjectionMark)):
-                        params = True
-
-                    getPostReq = True
-
-                # POST parameters
-                elif data is not None and params:
-                    data += "%s%s" % (line, newline)
-
-                # GET parameters
-                elif "?" in line and "=" in line and ": " not in line:
-                    params = True
-
-                # Headers
-                elif re.search(r"\A\S+:", line):
-                    key, value = line.split(":", 1)
-                    value = value.strip().replace("\r", "").replace("\n", "")
-
-                    # Cookie and Host headers
-                    if key.upper() == HTTP_HEADER.COOKIE.upper():
-                        cookie = value
-                    elif key.upper() == HTTP_HEADER.HOST.upper():
-                        if '://' in value:
-                            scheme, value = value.split('://')[:2]
-                        splitValue = value.split(":")
-                        host = splitValue[0]
-
-                        if len(splitValue) > 1:
-                            port = filterStringValue(splitValue[1], "[0-9]")
-
-                    # Avoid to add a static content length header to
-                    # headers and consider the following lines as
-                    # POSTed data
-                    if key.upper() == HTTP_HEADER.CONTENT_LENGTH.upper():
-                        params = True
-
-                    # Avoid proxy and connection type related headers
-                    elif key not in (HTTP_HEADER.PROXY_CONNECTION, HTTP_HEADER.CONNECTION):
-                        headers.append((getUnicode(key), getUnicode(value)))
-
-                    if kb.customInjectionMark in re.sub(PROBLEMATIC_CUSTOM_INJECTION_PATTERNS, "", value or ""):
-                        params = True
-
-            data = data.rstrip("\r\n") if data else data
-
-            if getPostReq and (params or cookie):
-                if not port and isinstance(scheme, basestring) and scheme.lower() == "https":
-                    port = "443"
-                elif not scheme and port == "443":
-                    scheme = "https"
-
-                if conf.forceSSL:
-                    scheme = "https"
-                    port = port or "443"
-
-                if not host:
-                    errMsg = "invalid format of a request file"
-                    raise SqlmapSyntaxException(errMsg)
-
-                if not url.startswith("http"):
-                    url = "%s://%s:%s%s" % (scheme or "http", host, port or "80", url)
-                    scheme = None
-                    port = None
-
-                if not(conf.scope and not re.search(conf.scope, url, re.I)):
-                    if not kb.targets or url not in addedTargetUrls:
-                        kb.targets.add((url, conf.method or method, data, cookie, tuple(headers)))
-                        addedTargetUrls.add(url)
-
-    checkFile(reqFile)
-    try:
-        with openFile(reqFile, "rb") as f:
-            content = f.read()
-    except (IOError, OSError, MemoryError), ex:
-        errMsg = "something went wrong while trying "
-        errMsg += "to read the content of file '%s' ('%s')" % (reqFile, getSafeExString(ex))
-        raise SqlmapSystemException(errMsg)
-
-    if conf.scope:
-        logger.info("using regular expression '%s' for filtering targets" % conf.scope)
-
-    _parseBurpLog(content)
-    _parseWebScarabLog(content)
-
-    if not addedTargetUrls:
-        errMsg = "unable to find usable request(s) "
-        errMsg += "in provided file ('%s')" % reqFile
-        raise SqlmapGenericException(errMsg)
-
 def _loadQueries():
     """
     Loads queries from 'xml/queries.xml' file.
@@ -414,7 +210,7 @@ def _setMultipleTargets():

     """

     initialTargetsCount = len(kb.targets)
-    addedTargetUrls = set()
+    seen = set()

     if not conf.logFile:
         return
@@ -427,7 +223,11 @@ def _setMultipleTargets():

         raise SqlmapFilePathException(errMsg)

     if os.path.isfile(conf.logFile):
-        _feedTargetsDict(conf.logFile, addedTargetUrls)
+        for target in parseRequestFile(conf.logFile):
+            url = target[0]
+            if url not in seen:
+                kb.targets.add(target)
+                seen.add(url)

     elif os.path.isdir(conf.logFile):
         files = os.listdir(conf.logFile)
@@ -437,7 +237,11 @@ def _setMultipleTargets():
|
|||||||
if not re.search(r"([\d]+)\-request", reqFile):
|
if not re.search(r"([\d]+)\-request", reqFile):
|
||||||
continue
|
continue
|
||||||
|
|
||||||
_feedTargetsDict(os.path.join(conf.logFile, reqFile), addedTargetUrls)
|
for target in parseRequestFile(os.path.join(conf.logFile, reqFile)):
|
||||||
|
url = target[0]
|
||||||
|
if url not in seen:
|
||||||
|
kb.targets.add(target)
|
||||||
|
seen.add(url)
|
||||||
|
|
||||||
else:
|
else:
|
||||||
errMsg = "the specified list of targets is not a file "
|
errMsg = "the specified list of targets is not a file "
|
||||||
@@ -478,22 +282,37 @@ def _setRequestFromFile():
|
|||||||
textual file, parses it and saves the information into the knowledge base.
|
textual file, parses it and saves the information into the knowledge base.
|
||||||
"""
|
"""
|
||||||
|
|
||||||
if not conf.requestFile:
|
if conf.requestFile:
|
||||||
return
|
conf.requestFile = safeExpandUser(conf.requestFile)
|
||||||
|
seen = set()
|
||||||
|
|
||||||
addedTargetUrls = set()
|
if not os.path.isfile(conf.requestFile):
|
||||||
|
errMsg = "specified HTTP request file '%s' " % conf.requestFile
|
||||||
|
errMsg += "does not exist"
|
||||||
|
raise SqlmapFilePathException(errMsg)
|
||||||
|
|
||||||
conf.requestFile = safeExpandUser(conf.requestFile)
|
infoMsg = "parsing HTTP request from '%s'" % conf.requestFile
|
||||||
|
logger.info(infoMsg)
|
||||||
|
|
||||||
if not os.path.isfile(conf.requestFile):
|
for target in parseRequestFile(conf.requestFile):
|
||||||
errMsg = "specified HTTP request file '%s' " % conf.requestFile
|
url = target[0]
|
||||||
errMsg += "does not exist"
|
if url not in seen:
|
||||||
raise SqlmapFilePathException(errMsg)
|
kb.targets.add(target)
|
||||||
|
seen.add(url)
|
||||||
|
|
||||||
infoMsg = "parsing HTTP request from '%s'" % conf.requestFile
|
if conf.secondReq:
|
||||||
logger.info(infoMsg)
|
conf.secondReq = safeExpandUser(conf.secondReq)
|
||||||
|
|
||||||
_feedTargetsDict(conf.requestFile, addedTargetUrls)
|
if not os.path.isfile(conf.secondReq):
|
||||||
|
errMsg = "specified second-order HTTP request file '%s' " % conf.secondReq
|
||||||
|
errMsg += "does not exist"
|
||||||
|
raise SqlmapFilePathException(errMsg)
|
||||||
|
|
||||||
|
infoMsg = "parsing second-order HTTP request from '%s'" % conf.secondReq
|
||||||
|
logger.info(infoMsg)
|
||||||
|
|
||||||
|
target = parseRequestFile(conf.secondReq, False).next()
|
||||||
|
kb.secondReq = target
|
||||||
|
|
||||||
def _setCrawler():
|
def _setCrawler():
|
||||||
if not conf.crawlDepth:
|
if not conf.crawlDepth:
|
||||||
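The hunks above replace the old `_feedTargetsDict` side-effect helper with a `parseRequestFile` generator whose yielded targets are deduplicated by URL through a `seen` set. A minimal sketch of that pattern (the `parse_request_file` stand-in below is hypothetical, not sqlmap's actual parser):

```python
def parse_request_file(entries):
    # Stand-in for sqlmap's parseRequestFile(): yields one
    # (url, method, data, cookie, headers) tuple per parsed request
    for url, method in entries:
        yield (url, method, None, None, ())

def collect_targets(entries):
    # Mirrors the new _setMultipleTargets() logic: the first occurrence
    # of each URL wins, later duplicates are silently dropped
    targets = []
    seen = set()
    for target in parse_request_file(entries):
        url = target[0]
        if url not in seen:
            targets.append(target)
            seen.add(url)
    return targets

entries = [("http://x/?id=1", "GET"), ("http://x/?id=1", "POST"), ("http://y/", "GET")]
print([t[0] for t in collect_targets(entries)])
```

The generator keeps parsing and target registration decoupled, which is what lets the same parser serve `-l`, `-r` and the new `--second-req` option.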
@@ -1590,7 +1409,7 @@ def _createTemporaryDirectory():
     try:
         if not os.path.isdir(tempfile.gettempdir()):
             os.makedirs(tempfile.gettempdir())
-    except (OSError, IOError, WindowsError), ex:
+    except Exception, ex:
         warnMsg = "there has been a problem while accessing "
         warnMsg += "system's temporary directory location(s) ('%s'). Please " % getSafeExString(ex)
         warnMsg += "make sure that there is enough disk space left. If problem persists, "
@@ -1601,7 +1420,7 @@ def _createTemporaryDirectory():
     if "sqlmap" not in (tempfile.tempdir or "") or conf.tmpDir and tempfile.tempdir == conf.tmpDir:
         try:
             tempfile.tempdir = tempfile.mkdtemp(prefix="sqlmap", suffix=str(os.getpid()))
-        except (OSError, IOError, WindowsError):
+        except:
             tempfile.tempdir = os.path.join(paths.SQLMAP_HOME_PATH, "tmp", "sqlmap%s%d" % (randomStr(6), os.getpid()))

     kb.tempDir = tempfile.tempdir
@@ -1609,7 +1428,7 @@ def _createTemporaryDirectory():
     if not os.path.isdir(tempfile.tempdir):
         try:
             os.makedirs(tempfile.tempdir)
-        except (OSError, IOError, WindowsError), ex:
+        except Exception, ex:
             errMsg = "there has been a problem while setting "
             errMsg += "temporary directory location ('%s')" % getSafeExString(ex)
             raise SqlmapSystemException(errMsg)
@@ -1722,7 +1541,7 @@ def _cleanupOptions():

     if conf.testFilter:
         conf.testFilter = conf.testFilter.strip('*+')
-        conf.testFilter = re.sub(r"([^.])([*+])", "\g<1>.\g<2>", conf.testFilter)
+        conf.testFilter = re.sub(r"([^.])([*+])", r"\g<1>.\g<2>", conf.testFilter)

         try:
             re.compile(conf.testFilter)
@@ -1731,7 +1550,7 @@ def _cleanupOptions():

     if conf.testSkip:
         conf.testSkip = conf.testSkip.strip('*+')
-        conf.testSkip = re.sub(r"([^.])([*+])", "\g<1>.\g<2>", conf.testSkip)
+        conf.testSkip = re.sub(r"([^.])([*+])", r"\g<1>.\g<2>", conf.testSkip)

         try:
             re.compile(conf.testSkip)
@@ -1802,6 +1621,9 @@ def _cleanupOptions():
     if any((conf.proxy, conf.proxyFile, conf.tor)):
         conf.disablePrecon = True

+    if conf.dummy:
+        conf.batch = True
+
     threadData = getCurrentThreadData()
     threadData.reset()

@@ -1816,23 +1638,13 @@ def _cleanupEnvironment():
     if hasattr(socket, "_ready"):
         socket._ready.clear()

-def _dirtyPatches():
+def _purge():
     """
-    Place for "dirty" Python related patches
+    Safely removes (purges) sqlmap data directory.
     """

-    httplib._MAXLINE = 1 * 1024 * 1024  # accept overly long result lines (e.g. SQLi results in HTTP header responses)
-
-    if IS_WIN:
-        from thirdparty.wininetpton import win_inet_pton  # add support for inet_pton() on Windows OS
-
-def _purgeOutput():
-    """
-    Safely removes (purges) output directory.
-    """
-
-    if conf.purgeOutput:
-        purge(paths.SQLMAP_OUTPUT_PATH)
+    if conf.purge:
+        purge(paths.SQLMAP_HOME_PATH)

 def _setConfAttributes():
     """
@@ -2022,6 +1834,7 @@ def _setKnowledgeBaseAttributes(flushAll=True):
     kb.rowXmlMode = False
     kb.safeCharEncode = False
     kb.safeReq = AttribDict()
+    kb.secondReq = None
     kb.singleLogFlags = set()
     kb.skipSeqMatcher = False
     kb.reduceTests = None
@@ -2411,6 +2224,10 @@ def _basicOptionValidation():
         errMsg = "switch '--eta' is incompatible with option '-v'"
         raise SqlmapSyntaxException(errMsg)

+    if conf.secondUrl and conf.secondReq:
+        errMsg = "option '--second-url' is incompatible with option '--second-req')"
+        raise SqlmapSyntaxException(errMsg)
+
     if conf.direct and conf.url:
         errMsg = "option '-d' is incompatible with option '-u' ('--url')"
         raise SqlmapSyntaxException(errMsg)
@@ -2633,8 +2450,7 @@ def init():
     _setRequestFromFile()
     _cleanupOptions()
     _cleanupEnvironment()
-    _dirtyPatches()
-    _purgeOutput()
+    _purge()
     _checkDependencies()
     _createTemporaryDirectory()
     _basicOptionValidation()
@@ -109,7 +109,8 @@ optDict = {
         "uChar": "string",
         "uFrom": "string",
         "dnsDomain": "string",
-        "secondOrder": "string",
+        "secondUrl": "string",
+        "secondReq": "string",
     },

     "Fingerprint": {
@@ -228,7 +229,7 @@ optDict = {
         "identifyWaf": "boolean",
         "mobile": "boolean",
         "offline": "boolean",
-        "purgeOutput": "boolean",
+        "purge": "boolean",
         "skipWaf": "boolean",
         "smart": "boolean",
         "tmpDir": "string",
lib/core/patch.py (new file, 26 lines)
@@ -0,0 +1,26 @@
+#!/usr/bin/env python
+
+"""
+Copyright (c) 2006-2018 sqlmap developers (http://sqlmap.org/)
+See the file 'LICENSE' for copying permission
+"""
+
+import codecs
+import httplib
+
+from lib.core.settings import IS_WIN
+
+def dirtyPatches():
+    """
+    Place for "dirty" Python related patches
+    """
+
+    # accept overly long result lines (e.g. SQLi results in HTTP header responses)
+    httplib._MAXLINE = 1 * 1024 * 1024
+
+    # add support for inet_pton() on Windows OS
+    if IS_WIN:
+        from thirdparty.wininetpton import win_inet_pton
+
+    # Reference: https://github.com/nodejs/node/issues/12786#issuecomment-298652440
+    codecs.register(lambda name: codecs.lookup("utf-8") if name == "cp65001" else None)
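The `codecs.register()` line in the new `dirtyPatches()` maps the Windows console code page name `cp65001` onto the stock UTF-8 codec via a codec search function. The same trick can alias any unknown encoding name; the alias name below is made up for illustration:

```python
import codecs

# A codec search function receives the normalized encoding name and returns a
# CodecInfo (here the stock UTF-8 one) or None to defer to other lookups
codecs.register(lambda name: codecs.lookup("utf-8") if name == "my_cp65001" else None)

print(u"caf\u00e9".encode("my_cp65001"))  # b'caf\xc3\xa9', same bytes as UTF-8
```

Registered search functions are consulted only after the built-in codec lookup fails, so aliasing an otherwise-unknown name is side-effect free.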
@@ -50,7 +50,7 @@ def profile(profileOutputFile=None, dotOutputFile=None, imageOutputFile=None):
     if os.path.exists(imageOutputFile):
         os.remove(imageOutputFile)

-    infoMsg = "profiling the execution into file %s" % profileOutputFile
+    infoMsg = "profiling the execution into file '%s'" % profileOutputFile
     logger.info(infoMsg)

     # Start sqlmap main function and generate a raw profile file
@@ -80,15 +80,20 @@ def profile(profileOutputFile=None, dotOutputFile=None, imageOutputFile=None):
     if isinstance(pydotGraph, list):
         pydotGraph = pydotGraph[0]

-    pydotGraph.write_png(imageOutputFile)
-
-    infoMsg = "displaying interactive graph with xdot library"
-    logger.info(infoMsg)
-
-    # Display interactive Graphviz dot file by using extra/xdot/xdot.py
-    # http://code.google.com/p/jrfonseca/wiki/XDot
-    win = xdot.DotWindow()
-    win.connect('destroy', gtk.main_quit)
-    win.set_filter("dot")
-    win.open_file(dotOutputFile)
-    gtk.main()
+    try:
+        pydotGraph.write_png(imageOutputFile)
+    except OSError:
+        errMsg = "profiling requires graphviz installed "
+        errMsg += "(Hint: 'sudo apt-get install graphviz')"
+        logger.error(errMsg)
+    else:
+        infoMsg = "displaying interactive graph with xdot library"
+        logger.info(infoMsg)
+
+        # Display interactive Graphviz dot file by using extra/xdot/xdot.py
+        # http://code.google.com/p/jrfonseca/wiki/XDot
+        win = xdot.DotWindow()
+        win.connect('destroy', gtk.main_quit)
+        win.set_filter("dot")
+        win.open_file(dotOutputFile)
+        gtk.main()
@@ -19,7 +19,7 @@ from lib.core.enums import DBMS_DIRECTORY_NAME
 from lib.core.enums import OS

 # sqlmap version (<major>.<minor>.<month>.<monthly commit>)
-VERSION = "1.2.5.0"
+VERSION = "1.2.7.0"
 TYPE = "dev" if VERSION.count('.') > 2 and VERSION.split('.')[-1] != '0' else "stable"
 TYPE_COLORS = {"dev": 33, "stable": 90, "pip": 34}
 VERSION_STRING = "sqlmap/%s#%s" % ('.'.join(VERSION.split('.')[:-1]) if VERSION.count('.') > 2 and VERSION.split('.')[-1] == '0' else VERSION, TYPE)
@@ -364,10 +364,10 @@ URI_HTTP_HEADER = "URI"
 URI_INJECTABLE_REGEX = r"//[^/]*/([^\.*?]+)\Z"

 # Regex used for masking sensitive data
-SENSITIVE_DATA_REGEX = "(\s|=)(?P<result>[^\s=]*%s[^\s]*)\s"
+SENSITIVE_DATA_REGEX = r"(\s|=)(?P<result>[^\s=]*%s[^\s]*)\s"

 # Options to explicitly mask in anonymous (unhandled exception) reports (along with anything carrying the <hostname> inside)
-SENSITIVE_OPTIONS = ("hostname", "data", "dnsDomain", "googleDork", "authCred", "proxyCred", "tbl", "db", "col", "user", "cookie", "proxy", "rFile", "wFile", "dFile", "testParameter", "authCred")
+SENSITIVE_OPTIONS = ("hostname", "answers", "data", "dnsDomain", "googleDork", "authCred", "proxyCred", "tbl", "db", "col", "user", "cookie", "proxy", "rFile", "wFile", "dFile", "testParameter", "authCred")

 # Maximum number of threads (avoiding connection issues and/or DoS)
 MAX_NUMBER_OF_THREADS = 10
@@ -388,7 +388,7 @@ CANDIDATE_SENTENCE_MIN_LENGTH = 10
 CUSTOM_INJECTION_MARK_CHAR = '*'

 # Other way to declare injection position
-INJECT_HERE_REGEX = '(?i)%INJECT[_ ]?HERE%'
+INJECT_HERE_REGEX = r"(?i)%INJECT[_ ]?HERE%"

 # Minimum chunk length used for retrieving data over error based payloads
 MIN_ERROR_CHUNK_LENGTH = 8
@@ -487,7 +487,7 @@ LEGAL_DISCLAIMER = "Usage of sqlmap for attacking targets without prior mutual c
 REFLECTIVE_MISS_THRESHOLD = 20

 # Regular expression used for extracting HTML title
-HTML_TITLE_REGEX = "<title>(?P<result>[^<]+)</title>"
+HTML_TITLE_REGEX = r"<title>(?P<result>[^<]+)</title>"

 # Table used for Base64 conversion in WordPress hash cracking routine
 ITOA64 = "./0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz"
@@ -598,7 +598,7 @@ HASHDB_RETRIEVE_RETRIES = 3
 HASHDB_END_TRANSACTION_RETRIES = 3

 # Unique milestone value used for forced deprecation of old HashDB values (e.g. when changing hash/pickle mechanism)
-HASHDB_MILESTONE_VALUE = "dPHoJRQYvs"  # python -c 'import random, string; print "".join(random.sample(string.ascii_letters, 10))'
+HASHDB_MILESTONE_VALUE = "BZzRotigLX"  # python -c 'import random, string; print "".join(random.sample(string.ascii_letters, 10))'

 # Warn user of possible delay due to large page dump in full UNION query injections
 LARGE_OUTPUT_THRESHOLD = 1024 ** 2
@@ -631,7 +631,7 @@ BANNER = re.sub(r"\[.\]", lambda _: "[\033[01;41m%s\033[01;49m]" % random.sample
 DUMMY_NON_SQLI_CHECK_APPENDIX = "<'\">"

 # Regular expression used for recognition of file inclusion errors
-FI_ERROR_REGEX = "(?i)[^\n]{0,100}(no such file|failed (to )?open)[^\n]{0,100}"
+FI_ERROR_REGEX = r"(?i)[^\n]{0,100}(no such file|failed (to )?open)[^\n]{0,100}"

 # Length of prefix and suffix used in non-SQLI heuristic checks
 NON_SQLI_CHECK_PREFIX_SUFFIX_LENGTH = 6
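Most of the settings changes above only convert regex constants to raw string literals. The pattern text is unchanged: `\s` inside a plain string literal already survives as backslash-s (it is not a recognized escape), but the raw form states that explicitly and avoids the invalid-escape deprecation warnings newer Python versions emit. A quick check:

```python
import re

plain = "(\\s|=)"   # what the literal "(\s|=)" actually stores: the \s survives
raw = r"(\s|=)"     # raw string literal spells out the same pattern text
assert plain == raw

# behaviour is unchanged: first whitespace-or-equals character is matched
assert re.search(raw, "a =b").group(0) == " "
```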
@@ -60,6 +60,8 @@ def saveHistory(completion=None):
         historyPath = paths.SQL_SHELL_HISTORY
     elif completion == AUTOCOMPLETE_TYPE.OS:
         historyPath = paths.OS_SHELL_HISTORY
+    elif completion == AUTOCOMPLETE_TYPE.API:
+        historyPath = paths.API_SHELL_HISTORY
     else:
         historyPath = paths.SQLMAP_SHELL_HISTORY

@@ -86,6 +88,8 @@ def loadHistory(completion=None):
         historyPath = paths.SQL_SHELL_HISTORY
     elif completion == AUTOCOMPLETE_TYPE.OS:
         historyPath = paths.OS_SHELL_HISTORY
+    elif completion == AUTOCOMPLETE_TYPE.API:
+        historyPath = paths.API_SHELL_HISTORY
     else:
         historyPath = paths.SQLMAP_SHELL_HISTORY
@@ -83,6 +83,7 @@ def _setRequestParams():
         conf.parameters[None] = "direct connection"
         return

+    hintNames = []
     testableParameters = False

     # Perform checks on GET parameters
@@ -101,7 +102,6 @@ def _setRequestParams():

     if conf.data is not None:
         conf.method = HTTPMETHOD.POST if not conf.method or conf.method == HTTPMETHOD.GET else conf.method
-        hintNames = []

         def process(match, repl):
             retVal = match.group(0)
@@ -148,8 +148,8 @@ def _setRequestParams():
                 match = re.search(r'(?P<name>[^"]+)"\s*:\s*\[([^\]]+)\]', conf.data)
                 if match and not (conf.testParameter and match.group("name") not in conf.testParameter):
                     _ = match.group(2)
-                    _ = re.sub(r'("[^"]+)"', '\g<1>%s"' % kb.customInjectionMark, _)
-                    _ = re.sub(r'(\A|,|\s+)(-?\d[\d\.]*\b)', '\g<0>%s' % kb.customInjectionMark, _)
+                    _ = re.sub(r'("[^"]+)"', r'\g<1>%s"' % kb.customInjectionMark, _)
+                    _ = re.sub(r'(\A|,|\s+)(-?\d[\d\.]*\b)', r'\g<0>%s' % kb.customInjectionMark, _)
                     conf.data = conf.data.replace(match.group(0), match.group(0).replace(match.group(2), _))

                     kb.postHint = POST_HINT.JSON
@@ -619,33 +619,35 @@ def _createTargetDirs():
     Create the output directory.
     """

-    try:
-        if not os.path.isdir(paths.SQLMAP_OUTPUT_PATH):
-            os.makedirs(paths.SQLMAP_OUTPUT_PATH)
-
-        _ = os.path.join(paths.SQLMAP_OUTPUT_PATH, randomStr())
-        open(_, "w+b").close()
-        os.remove(_)
-
-        if conf.outputDir:
-            warnMsg = "using '%s' as the output directory" % paths.SQLMAP_OUTPUT_PATH
-            logger.warn(warnMsg)
-    except (OSError, IOError), ex:
-        try:
-            tempDir = tempfile.mkdtemp(prefix="sqlmapoutput")
-        except Exception, _:
-            errMsg = "unable to write to the temporary directory ('%s'). " % _
-            errMsg += "Please make sure that your disk is not full and "
-            errMsg += "that you have sufficient write permissions to "
-            errMsg += "create temporary files and/or directories"
-            raise SqlmapSystemException(errMsg)
-
-        warnMsg = "unable to %s output directory " % ("create" if not os.path.isdir(paths.SQLMAP_OUTPUT_PATH) else "write to the")
-        warnMsg += "'%s' (%s). " % (paths.SQLMAP_OUTPUT_PATH, getUnicode(ex))
-        warnMsg += "Using temporary directory '%s' instead" % getUnicode(tempDir)
-        logger.warn(warnMsg)
-
-        paths.SQLMAP_OUTPUT_PATH = tempDir
+    for context in "output", "history":
+        directory = paths["SQLMAP_%s_PATH" % context.upper()]
+
+        try:
+            if not os.path.isdir(directory):
+                os.makedirs(directory)
+
+            _ = os.path.join(directory, randomStr())
+            open(_, "w+b").close()
+            os.remove(_)
+
+            if conf.outputDir and context == "output":
+                warnMsg = "using '%s' as the %s directory" % (directory, context)
+                logger.warn(warnMsg)
+        except (OSError, IOError), ex:
+            try:
+                tempDir = tempfile.mkdtemp(prefix="sqlmap%s" % context)
+            except Exception, _:
+                errMsg = "unable to write to the temporary directory ('%s'). " % _
+                errMsg += "Please make sure that your disk is not full and "
+                errMsg += "that you have sufficient write permissions to "
+                errMsg += "create temporary files and/or directories"
+                raise SqlmapSystemException(errMsg)
+
+            warnMsg = "unable to %s %s directory " % ("create" if not os.path.isdir(directory) else "write to the", context)
+            warnMsg += "'%s' (%s). " % (directory, getUnicode(ex))
+            warnMsg += "Using temporary directory '%s' instead" % getUnicode(tempDir)
+            logger.warn(warnMsg)
+
+            paths["SQLMAP_%s_PATH" % context.upper()] = tempDir

     conf.outputPath = os.path.join(getUnicode(paths.SQLMAP_OUTPUT_PATH), normalizeUnicode(getUnicode(conf.hostname)))
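The rewritten `_createTargetDirs()` above generalizes one hard-coded output path into a loop over the output and history directories, each probed for writability by creating and deleting a random file, with a per-context temporary-directory fallback. A condensed sketch of that probe-and-fallback idea (function and variable names hypothetical):

```python
import os
import tempfile
import uuid

def ensure_writable_dir(directory, context="output"):
    # Probe `directory` the way _createTargetDirs() does: create it if
    # missing, touch-and-remove a random file, and fall back to a fresh
    # per-context temporary directory on any failure
    try:
        if not os.path.isdir(directory):
            os.makedirs(directory)
        probe = os.path.join(directory, uuid.uuid4().hex)
        open(probe, "w+b").close()
        os.remove(probe)
        return directory
    except (OSError, IOError):
        return tempfile.mkdtemp(prefix="sqlmap%s" % context)

base = tempfile.mkdtemp()
out_dir = ensure_writable_dir(os.path.join(base, "output"))
```

The touch-and-remove probe matters because `os.path.isdir()` alone cannot tell a writable directory from a read-only one.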
@@ -49,6 +49,7 @@ def update():
             errMsg = "unable to update content of directory '%s' ('%s')" % (directory, getSafeExString(ex))
             logger.error(errMsg)
         else:
+            attrs = os.stat(os.path.join(directory, "sqlmap.py")).st_mode
             for wildcard in ('*', ".*"):
                 for _ in glob.glob(os.path.join(directory, wildcard)):
                     try:
@@ -83,6 +84,11 @@ def update():
             else:
                 if not success:
                     logger.error("update could not be completed")
+                else:
+                    try:
+                        os.chmod(os.path.join(directory, "sqlmap.py"), attrs)
+                    except OSError:
+                        logger.warning("could not set the file attributes of '%s'" % os.path.join(directory, "sqlmap.py"))
     else:
         infoMsg = "updating sqlmap to the latest development revision from the "
         infoMsg += "GitHub repository"
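The `update()` change above snapshots `sqlmap.py`'s permission bits before the files are replaced and restores them afterwards, so the entry script stays executable even though freshly extracted files get default (umask) permissions. The pattern in isolation, under hypothetical names:

```python
import os
import tempfile

def rewrite_preserving_mode(path, data):
    attrs = os.stat(path).st_mode      # snapshot permission bits up front
    new_path = path + ".new"
    with open(new_path, "wb") as f:    # replacement file gets default (umask) mode
        f.write(data)
    os.replace(new_path, path)
    try:
        os.chmod(path, attrs)          # restore the old bits, best effort
    except OSError:
        pass                           # warning-only behaviour, as in update()

path = os.path.join(tempfile.mkdtemp(), "sqlmap.py")
open(path, "wb").close()
os.chmod(path, 0o755)
rewrite_preserving_mode(path, b"#!/usr/bin/env python\n")
```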
@@ -253,13 +253,13 @@ def cmdLineParser(argv=None):
                                  help="Regexp to exclude parameters from testing (e.g. \"ses\")")

     injection.add_option("--dbms", dest="dbms",
-                         help="Force back-end DBMS to this value")
+                         help="Force back-end DBMS to provided value")

     injection.add_option("--dbms-cred", dest="dbmsCred",
                          help="DBMS authentication credentials (user:password)")

     injection.add_option("--os", dest="os",
-                         help="Force back-end DBMS operating system to this value")
+                         help="Force back-end DBMS operating system to provided value")

     injection.add_option("--invalid-bignum", dest="invalidBignum", action="store_true",
                          help="Use big numbers for invalidating values")
@@ -333,9 +333,12 @@ def cmdLineParser(argv=None):
     techniques.add_option("--dns-domain", dest="dnsDomain",
                           help="Domain name used for DNS exfiltration attack")

-    techniques.add_option("--second-order", dest="secondOrder",
+    techniques.add_option("--second-url", dest="secondUrl",
                           help="Resulting page URL searched for second-order response")

+    techniques.add_option("--second-req", dest="secondReq",
+                          help="Load second-order HTTP request from file")
+
     # Fingerprint options
     fingerprint = OptionGroup(parser, "Fingerprint")

@@ -400,7 +403,7 @@ def cmdLineParser(argv=None):
                            help="Search column(s), table(s) and/or database name(s)")

     enumeration.add_option("--comments", dest="getComments", action="store_true",
-                           help="Retrieve DBMS comments")
+                           help="Check for DBMS comments during enumeration")

     enumeration.add_option("-D", dest="db",
                            help="DBMS database to enumerate")
@@ -581,7 +584,7 @@ def cmdLineParser(argv=None):
                        help="Log all HTTP traffic into a HAR file")

     general.add_option("--hex", dest="hexConvert", action="store_true",
-                       help="Use DBMS hex function(s) for data retrieval")
+                       help="Use hex conversion during data retrieval")

     general.add_option("--output-dir", dest="outputDir", action="store",
                        help="Custom output directory path")
@@ -640,8 +643,8 @@ def cmdLineParser(argv=None):
     miscellaneous.add_option("--offline", dest="offline", action="store_true",
                              help="Work in offline mode (only use session data)")

-    miscellaneous.add_option("--purge-output", dest="purgeOutput", action="store_true",
-                             help="Safely remove all content from output directory")
+    miscellaneous.add_option("--purge", dest="purge", action="store_true",
+                             help="Safely remove all content from sqlmap data directory")

     miscellaneous.add_option("--skip-waf", dest="skipWaf", action="store_true",
                              help="Skip heuristic detection of WAF/IPS/IDS protection")
@@ -871,7 +874,7 @@ def cmdLineParser(argv=None):
         if args.dummy:
             args.url = args.url or DUMMY_URL

-        if not any((args.direct, args.url, args.logFile, args.bulkFile, args.googleDork, args.configFile, args.requestFile, args.updateAll, args.smokeTest, args.liveTest, args.wizard, args.dependencies, args.purgeOutput, args.sitemapUrl)):
+        if not any((args.direct, args.url, args.logFile, args.bulkFile, args.googleDork, args.configFile, args.requestFile, args.updateAll, args.smokeTest, args.liveTest, args.wizard, args.dependencies, args.purge, args.sitemapUrl)):
             errMsg = "missing a mandatory option (-d, -u, -l, -m, -r, -g, -c, -x, --wizard, --update, --purge-output or --dependencies), "
             errMsg += "use -h for basic or -hh for advanced help\n"
             parser.error(errMsg)
@@ -13,7 +13,6 @@ from lib.core.data import kb
 from lib.core.data import paths
 from lib.parse.handler import FingerprintHandler

-
 def headersParser(headers):
     """
     This function calls a class that parses the input HTTP headers to
@@ -9,6 +9,7 @@ import re
|
|||||||
|
|
||||||
from xml.sax.handler import ContentHandler
|
from xml.sax.handler import ContentHandler
|
||||||
|
|
||||||
|
from lib.core.common import urldecode
|
||||||
from lib.core.common import parseXmlFile
|
from lib.core.common import parseXmlFile
|
||||||
from lib.core.data import kb
|
from lib.core.data import kb
|
||||||
from lib.core.data import paths
|
from lib.core.data import paths
|
||||||
@@ -26,6 +27,7 @@ class HTMLHandler(ContentHandler):
         self._dbms = None
         self._page = (page or "")
         self._lower_page = self._page.lower()
+        self._urldecoded_page = urldecode(self._page)
 
         self.dbms = None
 
@@ -47,7 +49,7 @@ class HTMLHandler(ContentHandler):
                 keywords = sorted(keywords, key=len)
                 kb.cache.regex[regexp] = keywords[-1].lower()
 
-            if kb.cache.regex[regexp] in self._lower_page and re.search(regexp, self._page, re.I):
+            if kb.cache.regex[regexp] in self._lower_page and re.search(regexp, self._urldecoded_page, re.I):
                 self.dbms = self._dbms
                 self._markAsErrorPage()
 
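The change above runs the error-signature `re.search` against a URL-decoded copy of the page instead of the raw page, so DBMS error messages still match when the response body is percent-encoded. A minimal Python 3 sketch of the effect (the original code above is Python 2; the signature string here is a simplified stand-in for the patterns shipped in sqlmap's `errors.xml`):

```python
import re
from urllib.parse import unquote

# Percent-encoded response body: a plain search misses the signature.
page = "result%3A%20You%20have%20an%20error%20in%20your%20SQL%20syntax"
signature = r"SQL syntax"  # simplified stand-in for an errors.xml pattern

raw_hit = re.search(signature, page, re.I)               # no match on the raw page
decoded_hit = re.search(signature, unquote(page), re.I)  # matches after decoding
```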

@@ -36,7 +36,7 @@ def cleanupVals(text, tag):
     return text
 
 def parseXmlNode(node):
-    for element in node.getiterator('boundary'):
+    for element in node.getiterator("boundary"):
         boundary = AttribDict()
 
         for child in element.getchildren():
@@ -48,7 +48,7 @@ def parseXmlNode(node):
 
         conf.boundaries.append(boundary)
 
-    for element in node.getiterator('test'):
+    for element in node.getiterator("test"):
         test = AttribDict()
 
         for child in element.getchildren():

@@ -35,7 +35,6 @@ from lib.core.enums import PLACE
 from lib.core.exception import SqlmapCompressionException
 from lib.core.settings import BLOCKED_IP_REGEX
 from lib.core.settings import DEFAULT_COOKIE_DELIMITER
-from lib.core.settings import DEV_EMAIL_ADDRESS
 from lib.core.settings import EVENTVALIDATION_REGEX
 from lib.core.settings import MAX_CONNECTION_TOTAL_SIZE
 from lib.core.settings import META_CHARSET_REGEX
@@ -220,10 +219,6 @@ def checkCharEncoding(encoding, warn=True):
     try:
         codecs.lookup(encoding.encode(UNICODE_ENCODING) if isinstance(encoding, unicode) else encoding)
     except (LookupError, ValueError):
-        if warn and ' ' not in encoding:
-            warnMsg = "unknown web page charset '%s'. " % encoding
-            warnMsg += "Please report by e-mail to '%s'" % DEV_EMAIL_ADDRESS
-            singleTimeLogMessage(warnMsg, logging.WARN, encoding)
         encoding = None
 
     if encoding:
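After this hunk an unrecognized charset is silently discarded instead of prompting a report-by-mail warning; the validation that survives is just a `codecs.lookup` probe. A Python 3 sketch of the remaining logic (the function name here is illustrative, not sqlmap's):

```python
import codecs

def check_char_encoding(encoding):
    # Probe the codec registry; unknown names raise LookupError
    # (ValueError covers malformed names, e.g. with embedded NULs).
    try:
        codecs.lookup(encoding)
    except (LookupError, ValueError):
        encoding = None
    return encoding

print(check_char_encoding("utf-8"))          # utf-8
print(check_char_encoding("not-a-charset"))  # None
```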
@@ -390,7 +385,7 @@ def processResponse(page, responseHeaders, status=None):
                     continue
 
                 conf.paramDict[PLACE.POST][name] = value
-                conf.parameters[PLACE.POST] = re.sub(r"(?i)(%s=)[^&]+" % re.escape(name), r"\g<1>%s" % re.escape(value), conf.parameters[PLACE.POST])
+                conf.parameters[PLACE.POST] = re.sub(r"(?i)(%s=)[^&]+" % re.escape(name), r"\g<1>%s" % value.replace('\\', r'\\'), conf.parameters[PLACE.POST])
 
         if not kb.browserVerification and re.search(r"(?i)browser.?verification", page or ""):
             kb.browserVerification = True
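This hunk (and several like it below) stop passing the *replacement* value through `re.escape`, which is meant for patterns; in an `re.sub` replacement string only the backslash is special, so doubling backslashes is the correct and sufficient treatment. A small Python 3 illustration with made-up parameter data:

```python
import re

params = "id=1&token=old"
value = "a&b"  # hypothetical new value containing a regex metacharacter

# Wrong tool: re.escape() targets patterns, so it injects a literal
# backslash into the substituted text ("a\&b").
bad = re.sub(r"(token=)[^&]+", r"\g<1>%s" % re.escape(value), params)

# Correct: only the backslash needs escaping in a replacement string.
good = re.sub(r"(token=)[^&]+", r"\g<1>%s" % value.replace("\\", "\\\\"), params)

print(bad)   # id=1&token=a\&b
print(good)  # id=1&token=a&b
```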

@@ -8,7 +8,6 @@ See the file 'LICENSE' for copying permission
 import binascii
 import compiler
 import httplib
-import json
 import keyword
 import logging
 import re
@@ -120,7 +119,6 @@ from lib.request.methodrequest import MethodRequest
 from thirdparty.odict.odict import OrderedDict
 from thirdparty.socks.socks import ProxyError
 
-
 class Connect(object):
     """
     This class defines methods used to perform HTTP requests
@@ -409,8 +407,10 @@ class Connect(object):
                 ws.close()
                 code = ws.status
                 status = httplib.responses[code]
 
+                class _(dict):
+                    pass
 
                 responseHeaders = _(ws.getheaders())
                 responseHeaders.headers = ["%s: %s\r\n" % (_[0].capitalize(), _[1]) for _ in responseHeaders.items()]
 
@@ -645,13 +645,6 @@ class Connect(object):
             elif "forcibly closed" in tbMsg or "Connection is already closed" in tbMsg:
                 warnMsg = "connection was forcibly closed by the target URL"
             elif "timed out" in tbMsg:
-                if not conf.disablePrecon:
-                    singleTimeWarnMessage("turning off pre-connect mechanism because of connection time out(s)")
-                    conf.disablePrecon = True
-
-                if kb.testMode and kb.testType not in (PAYLOAD.TECHNIQUE.TIME, PAYLOAD.TECHNIQUE.STACKED):
-                    kb.responseTimes.clear()
-
                 if kb.testMode and kb.testType not in (None, PAYLOAD.TECHNIQUE.TIME, PAYLOAD.TECHNIQUE.STACKED):
                     singleTimeWarnMessage("there is a possibility that the target (or WAF/IPS/IDS) is dropping 'suspicious' requests")
                     kb.droppingRequests = True
@@ -744,10 +737,10 @@ class Connect(object):
             if conn and getattr(conn, "redurl", None):
                 _ = urlparse.urlsplit(conn.redurl)
                 _ = ("%s%s" % (_.path or "/", ("?%s" % _.query) if _.query else ""))
-                requestMsg = re.sub(r"(\n[A-Z]+ ).+?( HTTP/\d)", "\g<1>%s\g<2>" % getUnicode(_).replace("\\", "\\\\"), requestMsg, 1)
+                requestMsg = re.sub(r"(\n[A-Z]+ ).+?( HTTP/\d)", r"\g<1>%s\g<2>" % getUnicode(_).replace("\\", "\\\\"), requestMsg, 1)
 
                 if kb.resendPostOnRedirect is False:
-                    requestMsg = re.sub(r"(\[#\d+\]:\n)POST ", "\g<1>GET ", requestMsg)
+                    requestMsg = re.sub(r"(\[#\d+\]:\n)POST ", r"\g<1>GET ", requestMsg)
                     requestMsg = re.sub(r"(?i)Content-length: \d+\n", "", requestMsg)
                     requestMsg = re.sub(r"(?s)\n\n.+", "\n", requestMsg)
 
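The only change in the hunk above is the `r` prefix on the replacement strings: `\g` is not a valid escape sequence in a plain Python string literal, and newer interpreters warn about it, so raw strings make the backreference explicit. A self-contained Python 3 sketch of the rewritten POST-to-GET substitution:

```python
import re

request_msg = "[#1]:\nPOST /login.php HTTP/1.1"

# r"\g<1>GET " keeps group 1 (the "[#N]:" prefix and newline) while
# swapping the verb; without the r prefix, "\g" would rely on Python
# tolerating an invalid escape sequence in a plain string literal.
rewritten = re.sub(r"(\[#\d+\]:\n)POST ", r"\g<1>GET ", request_msg)
print(rewritten)
```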
@@ -1112,33 +1105,33 @@ class Connect(object):
             if kb.postHint in (POST_HINT.XML, POST_HINT.SOAP):
                 if re.search(r"<%s\b" % re.escape(name), post):
                     found = True
-                    post = re.sub(r"(?s)(<%s\b[^>]*>)(.*?)(</%s)" % (re.escape(name), re.escape(name)), "\g<1>%s\g<3>" % value.replace('\\', r'\\'), post)
+                    post = re.sub(r"(?s)(<%s\b[^>]*>)(.*?)(</%s)" % (re.escape(name), re.escape(name)), r"\g<1>%s\g<3>" % value.replace('\\', r'\\'), post)
                 elif re.search(r"\b%s>" % re.escape(name), post):
                     found = True
-                    post = re.sub(r"(?s)(\b%s>)(.*?)(</[^<]*\b%s>)" % (re.escape(name), re.escape(name)), "\g<1>%s\g<3>" % value.replace('\\', r'\\'), post)
+                    post = re.sub(r"(?s)(\b%s>)(.*?)(</[^<]*\b%s>)" % (re.escape(name), re.escape(name)), r"\g<1>%s\g<3>" % value.replace('\\', r'\\'), post)
 
             regex = r"\b(%s)\b([^\w]+)(\w+)" % re.escape(name)
             if not found and re.search(regex, (post or "")):
                 found = True
-                post = re.sub(regex, "\g<1>\g<2>%s" % value.replace('\\', r'\\'), post)
+                post = re.sub(regex, r"\g<1>\g<2>%s" % value.replace('\\', r'\\'), post)
 
             regex = r"((\A|%s)%s=).+?(%s|\Z)" % (re.escape(delimiter), re.escape(name), re.escape(delimiter))
             if not found and re.search(regex, (post or "")):
                 found = True
-                post = re.sub(regex, "\g<1>%s\g<3>" % value.replace('\\', r'\\'), post)
+                post = re.sub(regex, r"\g<1>%s\g<3>" % value.replace('\\', r'\\'), post)
 
             if re.search(regex, (get or "")):
                 found = True
-                get = re.sub(regex, "\g<1>%s\g<3>" % value.replace('\\', r'\\'), get)
+                get = re.sub(regex, r"\g<1>%s\g<3>" % value.replace('\\', r'\\'), get)
 
             if re.search(regex, (query or "")):
                 found = True
-                uri = re.sub(regex.replace(r"\A", r"\?"), "\g<1>%s\g<3>" % value.replace('\\', r'\\'), uri)
+                uri = re.sub(regex.replace(r"\A", r"\?"), r"\g<1>%s\g<3>" % value.replace('\\', r'\\'), uri)
 
             regex = r"((\A|%s)%s=).+?(%s|\Z)" % (re.escape(conf.cookieDel or DEFAULT_COOKIE_DELIMITER), re.escape(name), re.escape(conf.cookieDel or DEFAULT_COOKIE_DELIMITER))
             if re.search(regex, (cookie or "")):
                 found = True
-                cookie = re.sub(regex, "\g<1>%s\g<3>" % value.replace('\\', r'\\'), cookie)
+                cookie = re.sub(regex, r"\g<1>%s\g<3>" % value.replace('\\', r'\\'), cookie)
 
             if not found:
                 if post is not None:
@@ -1169,7 +1162,7 @@ class Connect(object):
                     singleTimeWarnMessage(warnMsg)
 
                 warnMsg = "[%s] [WARNING] %stime-based comparison requires " % (time.strftime("%X"), "(case) " if kb.responseTimeMode else "")
-                warnMsg += "larger statistical model, please wait"
+                warnMsg += "%s statistical model, please wait" % ("larger" if len(kb.responseTimes) == 1 else "reset of")
                 dataToStdout(warnMsg)
 
                 while len(kb.responseTimes[kb.responseTimeMode]) < MIN_TIME_RESPONSES:
@@ -1242,8 +1235,10 @@ class Connect(object):
                     warnMsg += "behavior in custom WAF/IPS/IDS solutions"
                     singleTimeWarnMessage(warnMsg)
 
-                if conf.secondOrder:
-                    page, headers, code = Connect.getPage(url=conf.secondOrder, cookie=cookie, ua=ua, silent=silent, auxHeaders=auxHeaders, response=response, raise404=False, ignoreTimeout=timeBasedCompare, refreshing=True)
+                if conf.secondUrl:
+                    page, headers, code = Connect.getPage(url=conf.secondUrl, cookie=cookie, ua=ua, silent=silent, auxHeaders=auxHeaders, response=response, raise404=False, ignoreTimeout=timeBasedCompare, refreshing=True)
+                elif kb.secondReq:
+                    page, headers, code = Connect.getPage(url=kb.secondReq[0], post=kb.secondReq[2], method=kb.secondReq[1], cookie=kb.secondReq[3], silent=silent, auxHeaders=dict(auxHeaders, **dict(kb.secondReq[4])), response=response, raise404=False, ignoreTimeout=timeBasedCompare, refreshing=True)
 
                 threadData.lastQueryDuration = calculateDeltaSeconds(start)
                 threadData.lastPage = page

@@ -12,6 +12,7 @@ import socket
 import urllib2
 
 from lib.core.common import getSafeExString
+from lib.core.data import conf
 from lib.core.data import kb
 from lib.core.data import logger
 from lib.core.exception import SqlmapConnectionException
@@ -48,7 +49,7 @@ class HTTPSConnection(httplib.HTTPSConnection):
 
         # Reference(s): https://docs.python.org/2/library/ssl.html#ssl.SSLContext
         #               https://www.mnot.net/blog/2014/12/27/python_2_and_tls_sni
-        if re.search(r"\A[\d.]+\Z", self.host) is None and kb.tlsSNI.get(self.host) is not False and hasattr(ssl, "SSLContext"):
+        if re.search(r"\A[\d.]+\Z", self.host) is None and kb.tlsSNI.get(self.host) is not False and not any((conf.proxy, conf.tor)) and hasattr(ssl, "SSLContext"):
             for protocol in filter(lambda _: _ >= ssl.PROTOCOL_TLSv1, _protocols):
                 try:
                     sock = create_sock()

@@ -77,6 +77,9 @@ def _goInference(payload, expression, charsetType=None, firstChar=None, lastChar
 
     value = _goDns(payload, expression)
 
+    if payload is None:
+        return None
+
     if value is not None:
         return value
 
@@ -437,7 +440,8 @@ def getValue(expression, blind=True, union=True, error=True, time=True, fromUser
     found = (value is not None) or (value is None and expectingNone) or count >= MAX_TECHNIQUES_PER_VALUE
 
     if time and (isTechniqueAvailable(PAYLOAD.TECHNIQUE.TIME) or isTechniqueAvailable(PAYLOAD.TECHNIQUE.STACKED)) and not found:
-        kb.responseTimeMode = re.sub(r"(?i)[^a-z]", "", re.sub(r"'[^']+'", "", re.sub(r"(?i)(\w+)\(.+\)", r"\g<1>", expression))) if re.search(r"(?i)SELECT.+FROM", expression) else None
+        match = re.search(r"\bFROM\b ([^ ]+).+ORDER BY ([^ ]+)", expression)
+        kb.responseTimeMode = "%s|%s" % (match.group(1), match.group(2)) if match else None
 
         if isTechniqueAvailable(PAYLOAD.TECHNIQUE.TIME):
             kb.technique = PAYLOAD.TECHNIQUE.TIME
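The new `responseTimeMode` key is derived from the queried table and the `ORDER BY` column rather than from a stripped-down copy of the whole expression. The extraction itself is a single regex; a quick Python 3 check with a sample expression (the expression string is fabricated for illustration):

```python
import re

expression = "SELECT password FROM users ORDER BY id"

# Same pattern as in the hunk: capture the FROM target and the
# ORDER BY column to key the per-query response-time statistics.
match = re.search(r"\bFROM\b ([^ ]+).+ORDER BY ([^ ]+)", expression)
mode = "%s|%s" % (match.group(1), match.group(2)) if match else None
print(mode)  # users|id
```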

@@ -32,7 +32,7 @@ class HTTPRangeHandler(urllib2.BaseHandler):
         urllib2.install_opener(opener)
 
         # create Request and set Range header
-        req = urllib2.Request('http://www.python.org/')
+        req = urllib2.Request('https://www.python.org/')
         req.header['Range'] = 'bytes=30-50'
         f = urllib2.urlopen(req)
     """

@@ -5,7 +5,6 @@ Copyright (c) 2006-2018 sqlmap developers (http://sqlmap.org/)
 See the file 'LICENSE' for copying permission
 """
 
-import re
 import time
 import types
 import urllib2
@@ -124,12 +123,21 @@ class SmartRedirectHandler(urllib2.HTTPRedirectHandler):
 
             req.headers[HTTP_HEADER.HOST] = getHostHeader(redurl)
             if headers and HTTP_HEADER.SET_COOKIE in headers:
+                cookies = dict()
                 delimiter = conf.cookieDel or DEFAULT_COOKIE_DELIMITER
-                _ = headers[HTTP_HEADER.SET_COOKIE].split(delimiter)[0]
-                if HTTP_HEADER.COOKIE not in req.headers:
-                    req.headers[HTTP_HEADER.COOKIE] = _
-                else:
-                    req.headers[HTTP_HEADER.COOKIE] = re.sub(r"%s{2,}" % delimiter, delimiter, ("%s%s%s" % (re.sub(r"\b%s=[^%s]*%s?" % (re.escape(_.split('=')[0]), delimiter, delimiter), "", req.headers[HTTP_HEADER.COOKIE]), delimiter, _)).strip(delimiter))
+                last = None
+
+                for part in req.headers.get(HTTP_HEADER.COOKIE, "").split(delimiter) + headers.getheaders(HTTP_HEADER.SET_COOKIE):
+                    if '=' in part:
+                        part = part.strip()
+                        key, value = part.split('=', 1)
+                        cookies[key] = value
+                        last = key
+                    elif last:
+                        cookies[last] += "%s%s" % (delimiter, part)
+
+                req.headers[HTTP_HEADER.COOKIE] = delimiter.join("%s=%s" % (key, cookies[key]) for key in cookies)
 
         try:
             result = urllib2.HTTPRedirectHandler.http_error_302(self, req, fp, code, msg, headers)
         except urllib2.HTTPError, e:
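The rewritten redirect handling merges all cookies into a dictionary, letting later `Set-Cookie` values override same-named cookies from the original request, instead of splicing strings with nested `re.sub` calls. A standalone Python 3 sketch of that merging logic (the function here is an illustration, not sqlmap's code, which operates on Python 2 `urllib2` header objects):

```python
def merge_cookies(request_cookie, set_cookie_headers, delimiter=";"):
    # Later Set-Cookie values override same-named request cookies;
    # fragments without '=' are glued onto the previous cookie value.
    cookies = {}
    last = None
    for part in request_cookie.split(delimiter) + set_cookie_headers:
        if "=" in part:
            part = part.strip()
            key, value = part.split("=", 1)
            cookies[key] = value
            last = key
        elif last:
            cookies[last] += "%s%s" % (delimiter, part)
    return delimiter.join("%s=%s" % (k, v) for k, v in cookies.items())

print(merge_cookies("a=1; b=2", ["b=3"]))  # a=1;b=3
```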

@@ -27,7 +27,6 @@ from lib.takeover.udf import UDF
 from lib.takeover.web import Web
 from lib.takeover.xp_cmdshell import XP_cmdshell
 
-
 class Abstraction(Web, UDF, XP_cmdshell):
     """
     This class defines an abstraction layer for OS takeover functionalities

@@ -53,7 +53,6 @@ from lib.core.settings import VIEWSTATE_REGEX
 from lib.request.connect import Connect as Request
 from thirdparty.oset.pyoset import oset
 
-
 class Web:
     """
     This class defines web-oriented OS takeover functionalities for
@@ -220,7 +219,7 @@ class Web:
             finally:
                 been.add(url)
 
-        url = re.sub(r"(\.\w+)\Z", "~\g<1>", conf.url)
+        url = re.sub(r"(\.\w+)\Z", r"~\g<1>", conf.url)
         if url not in been:
             try:
                 page, _, _ = Request.getPage(url=url, raise404=False, silent=True)
@@ -232,7 +231,7 @@ class Web:
 
         for place in (PLACE.GET, PLACE.POST):
             if place in conf.parameters:
-                value = re.sub(r"(\A|&)(\w+)=", "\g<2>[]=", conf.parameters[place])
+                value = re.sub(r"(\A|&)(\w+)=", r"\g<2>[]=", conf.parameters[place])
                 if "[]" in value:
                     page, headers, _ = Request.queryPage(value=value, place=place, content=True, raise404=False, silent=True, noteResponseTime=False)
                     parseFilePaths(page)
@@ -244,12 +243,12 @@ class Web:
                     cookie = headers[HTTP_HEADER.SET_COOKIE]
 
                     if cookie:
-                        value = re.sub(r"(\A|;)(\w+)=[^;]*", "\g<2>=AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA", cookie)
+                        value = re.sub(r"(\A|;)(\w+)=[^;]*", r"\g<2>=AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA", cookie)
                         if value != cookie:
                             page, _, _ = Request.queryPage(value=value, place=PLACE.COOKIE, content=True, raise404=False, silent=True, noteResponseTime=False)
                             parseFilePaths(page)
 
-                        value = re.sub(r"(\A|;)(\w+)=[^;]*", "\g<2>=", cookie)
+                        value = re.sub(r"(\A|;)(\w+)=[^;]*", r"\g<2>=", cookie)
                         if value != cookie:
                             page, _, _ = Request.queryPage(value=value, place=PLACE.COOKIE, content=True, raise404=False, silent=True, noteResponseTime=False)
                             parseFilePaths(page)

@@ -69,6 +69,9 @@ def bisection(payload, expression, length=None, charsetType=None, firstChar=None
     finalValue = None
     retrievedLength = 0
 
+    if payload is None:
+        return 0, None
+
     if charsetType is None and conf.charset:
         asciiTbl = sorted(set(ord(_) for _ in conf.charset))
     else:
@@ -187,7 +190,7 @@ def bisection(payload, expression, length=None, charsetType=None, firstChar=None
             with hintlock:
                 hintValue = kb.hintValue
 
-            if hintValue is not None and len(hintValue) >= idx:
+            if payload is not None and hintValue is not None and len(hintValue) >= idx:
                 if Backend.getIdentifiedDbms() in (DBMS.SQLITE, DBMS.ACCESS, DBMS.MAXDB, DBMS.DB2):
                     posValue = hintValue[idx - 1]
                 else:

@@ -14,7 +14,6 @@ from lib.core.dicts import FROM_DUMMY_TABLE
 from lib.core.exception import SqlmapNotVulnerableException
 from lib.techniques.dns.use import dnsUse
 
-
 def dnsTest(payload):
     logger.info("testing for data retrieval through DNS channel")
 

@@ -33,7 +33,6 @@ from lib.core.settings import PARTIAL_VALUE_MARKER
 from lib.core.unescaper import unescaper
 from lib.request.connect import Connect as Request
 
-
 def dnsUse(payload, expression):
     """
     Retrieve the output of a SQL query taking advantage of the DNS
@@ -84,7 +83,7 @@ def dnsUse(payload, expression):
                     _ = conf.dnsServer.pop(prefix, suffix)
 
                     if _:
-                        _ = extractRegexResult("%s\.(?P<result>.+)\.%s" % (prefix, suffix), _, re.I)
+                        _ = extractRegexResult(r"%s\.(?P<result>.+)\.%s" % (prefix, suffix), _, re.I)
                         _ = decodeHexValue(_)
                         output = (output or "") + _
                         offset += len(_)

@@ -414,7 +414,7 @@ def errorUse(expression, dump=False):
                 break
 
     if output and isListLike(output) and len(output) == 1:
-        output = output[0]
+        output = unArrayizeValue(output)
 
     with kb.locks.value:
         index = None
@@ -446,7 +446,7 @@ def errorUse(expression, dump=False):
         value = _errorFields(expression, expressionFields, expressionFieldsList)
 
     if value and isListLike(value) and len(value) == 1 and isinstance(value[0], basestring):
-        value = value[0]
+        value = unArrayizeValue(value)
 
     duration = calculateDeltaSeconds(start)
 
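Both `errorUse` hunks replace manual `[0]` indexing with `unArrayizeValue`, sqlmap's helper for collapsing single-element sequences. A hedged re-implementation of the idea (the real helper lives in `lib/core/common.py` and may differ in detail):

```python
def un_arrayize(value):
    # Collapse one-element lists/tuples to the bare element; leave
    # everything else untouched (sketch of sqlmap's unArrayizeValue).
    while isinstance(value, (list, tuple)) and len(value) == 1:
        value = value[0]
    return value

print(un_arrayize(["7.0.3"]))        # 7.0.3
print(un_arrayize(["a", "b"]))       # ['a', 'b']
print(un_arrayize("already-plain"))  # already-plain
```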

@@ -90,8 +90,8 @@ def _findUnionCharCount(comment, place, parameter, value, prefix, suffix, where=
     kb.errorIsNone = False
     lowerCount, upperCount = conf.uColsStart, conf.uColsStop
 
-    if lowerCount == 1 or conf.uCols:
-        found = kb.orderByColumns or (_orderByTechnique(lowerCount, upperCount) if conf.uCols else _orderByTechnique())
+    if kb.orderByColumns is None and (lowerCount == 1 or conf.uCols): # ORDER BY is not bullet-proof
+        found = _orderByTechnique(lowerCount, upperCount) if conf.uCols else _orderByTechnique()
         if found:
             kb.orderByColumns = found
             infoMsg = "target URL appears to have %d column%s in query" % (found, 's' if found > 1 else "")
@@ -116,10 +116,10 @@ def _findUnionCharCount(comment, place, parameter, value, prefix, suffix, where=
                 items.append((count, ratio))
 
         if not isNullValue(kb.uChar):
-            for regex in (kb.uChar, r'>\s*%s\s*<' % kb.uChar):
-                contains = tuple((count, re.search(regex, _ or "", re.IGNORECASE) is not None) for count, _ in pages.items())
-                if len(filter(lambda _: _[1], contains)) == 1:
-                    retVal = filter(lambda _: _[1], contains)[0][0]
+            for regex in (kb.uChar.strip("'"), r'>\s*%s\s*<' % kb.uChar.strip("'")):
+                contains = [count for count, content in pages.items() if re.search(regex, content or "", re.IGNORECASE) is not None]
+                if len(contains) == 1:
+                    retVal = contains[0]
                     break
 
         if not retVal:
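The rewritten detection keeps only the column counts whose page actually echoes the injected marker, and accepts a result only when exactly one count matches. A Python 3 toy version with fabricated page contents:

```python
import re

# Hypothetical responses keyed by tested column count; only count 3
# reflects the injected marker string back into the page.
pages = {1: "<td>error</td>", 2: None, 3: "<td>sqlmaptest</td>"}
marker = "'sqlmaptest'"

contains = [count for count, content in pages.items() if re.search(marker.strip("'"), content or "", re.IGNORECASE) is not None]
ret_val = contains[0] if len(contains) == 1 else None
print(ret_val)  # 3
```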
@@ -267,6 +267,8 @@ def _unionTestByCharBruteforce(comment, place, parameter, value, prefix, suffix)
 
     validPayload = None
     vector = None
+    orderBy = kb.orderByColumns
+    uChars = (conf.uChar, kb.uChar)
 
     # In case that user explicitly stated number of columns affected
     if conf.uColsStop == conf.uColsStart:
@@ -301,6 +303,10 @@ def _unionTestByCharBruteforce(comment, place, parameter, value, prefix, suffix)
     if not all((validPayload, vector)) and not warnMsg.endswith("consider "):
         singleTimeWarnMessage(warnMsg)
 
+    if count and orderBy is None and kb.orderByColumns is not None: # discard ORDER BY results (not usable - e.g. maybe invalid altogether)
+        conf.uChar, kb.uChar = uChars
+        validPayload, vector = _unionTestByCharBruteforce(comment, place, parameter, value, prefix, suffix)
+
     return validPayload, vector
 
 def unionTest(comment, place, parameter, value, prefix, suffix):

@@ -33,9 +33,10 @@ from lib.core.data import paths
 from lib.core.data import logger
 from lib.core.datatype import AttribDict
 from lib.core.defaults import _defaults
+from lib.core.dicts import PART_RUN_CONTENT_TYPES
+from lib.core.enums import AUTOCOMPLETE_TYPE
 from lib.core.enums import CONTENT_STATUS
 from lib.core.enums import MKSTEMP_PREFIX
-from lib.core.enums import PART_RUN_CONTENT_TYPES
 from lib.core.exception import SqlmapConnectionException
 from lib.core.log import LOGGER_HANDLER
 from lib.core.optiondict import optDict
@@ -43,6 +44,7 @@ from lib.core.settings import RESTAPI_DEFAULT_ADAPTER
 from lib.core.settings import IS_WIN
 from lib.core.settings import RESTAPI_DEFAULT_ADDRESS
 from lib.core.settings import RESTAPI_DEFAULT_PORT
+from lib.core.shell import autoCompletion
 from lib.core.subprocessng import Popen
 from lib.parse.cmdline import cmdLineParser
 from thirdparty.bottle.bottle import error as return_error
@@ -104,9 +106,7 @@ class Database(object):
 
     def init(self):
         self.execute("CREATE TABLE logs(id INTEGER PRIMARY KEY AUTOINCREMENT, taskid INTEGER, time TEXT, level TEXT, message TEXT)")
-
         self.execute("CREATE TABLE data(id INTEGER PRIMARY KEY AUTOINCREMENT, taskid INTEGER, status INTEGER, content_type INTEGER, value TEXT)")
-
         self.execute("CREATE TABLE errors(id INTEGER PRIMARY KEY AUTOINCREMENT, taskid INTEGER, error TEXT)")
 
 class Task(object):
@@ -161,6 +161,8 @@ class Task(object):
             self.process = Popen(["python", "sqlmap.py", "--api", "-c", configFile], shell=False, close_fds=not IS_WIN)
         elif os.path.exists(os.path.join(os.getcwd(), "sqlmap.py")):
             self.process = Popen(["python", "sqlmap.py", "--api", "-c", configFile], shell=False, cwd=os.getcwd(), close_fds=not IS_WIN)
+        elif os.path.exists(os.path.join(os.path.abspath(os.path.dirname(sys.argv[0])), "sqlmap.py")):
+            self.process = Popen(["python", "sqlmap.py", "--api", "-c", configFile], shell=False, cwd=os.path.join(os.path.abspath(os.path.dirname(sys.argv[0]))), close_fds=not IS_WIN)
         else:
             self.process = Popen(["sqlmap", "--api", "-c", configFile], shell=False, close_fds=not IS_WIN)
 
@@ -199,7 +201,6 @@ class Task(object):
     def engine_has_terminated(self):
         return isinstance(self.engine_get_returncode(), int)
 
-
 # Wrapper functions for sqlmap engine
 class StdDbOut(object):
     def __init__(self, taskid, messagetype="stdout"):
@@ -499,9 +500,7 @@ def scan_stop(taskid):
     Stop a scan
     """
 
-    if (taskid not in DataStore.tasks or
-            DataStore.tasks[taskid].engine_process() is None or
-            DataStore.tasks[taskid].engine_has_terminated()):
+    if (taskid not in DataStore.tasks or DataStore.tasks[taskid].engine_process() is None or DataStore.tasks[taskid].engine_has_terminated()):
         logger.warning("[%s] Invalid task ID provided to scan_stop()" % taskid)
         return jsonize({"success": False, "message": "Invalid task ID"})
 
@@ -516,9 +515,7 @@ def scan_kill(taskid):
     Kill a scan
     """
 
-    if (taskid not in DataStore.tasks or
-            DataStore.tasks[taskid].engine_process() is None or
-            DataStore.tasks[taskid].engine_has_terminated()):
+    if (taskid not in DataStore.tasks or DataStore.tasks[taskid].engine_process() is None or DataStore.tasks[taskid].engine_has_terminated()):
         logger.warning("[%s] Invalid task ID provided to scan_kill()" % taskid)
         return jsonize({"success": False, "message": "Invalid task ID"})
 
@@ -573,7 +570,6 @@ def scan_data(taskid):
     logger.debug("[%s] Retrieved scan data and error messages" % taskid)
     return jsonize({"success": True, "data": json_data_message, "error": json_errors_message})
 
-
 # Functions to handle scans' logs
 @get("/scan/<taskid>/log/<start>/<end>")
 def scan_log_limited(taskid, start, end):
@@ -601,7 +597,6 @@ def scan_log_limited(taskid, start, end):
     logger.debug("[%s] Retrieved scan log messages subset" % taskid)
     return jsonize({"success": True, "log": json_log_messages})
 
-
 @get("/scan/<taskid>/log")
 def scan_log(taskid):
     """
@@ -621,7 +616,6 @@ def scan_log(taskid):
     logger.debug("[%s] Retrieved scan log messages" % taskid)
     return jsonize({"success": True, "log": json_log_messages})
 
-
 # Function to handle files inside the output directory
 @get("/download/<taskid>/<target>/<filename:path>")
 def download(taskid, target, filename):
@@ -648,7 +642,6 @@ def download(taskid, target, filename):
     logger.warning("[%s] File does not exist %s" % (taskid, target))
     return jsonize({"success": False, "message": "File does not exist"})
 
-
 def server(host=RESTAPI_DEFAULT_ADDRESS, port=RESTAPI_DEFAULT_PORT, adapter=RESTAPI_DEFAULT_ADAPTER, username=None, password=None):
     """
     REST-JSON API server
@@ -696,7 +689,7 @@ def server(host=RESTAPI_DEFAULT_ADDRESS, port=RESTAPI_DEFAULT_PORT, adapter=RESTAPI_DEFAULT_ADAPTER, username=None, password=None):
     except ImportError:
         if adapter.lower() not in server_names:
             errMsg = "Adapter '%s' is unknown. " % adapter
-            errMsg += "(Note: available adapters '%s')" % ', '.join(sorted(server_names.keys()))
+            errMsg += "List of supported adapters: %s" % ', '.join(sorted(server_names.keys()))
         else:
             errMsg = "Server support for adapter '%s' is not installed on this system " % adapter
             errMsg += "(Note: you can try to install it with 'sudo apt-get install python-%s' or 'sudo pip install %s')" % (adapter, adapter)
@@ -750,6 +743,9 @@ def client(host=RESTAPI_DEFAULT_ADDRESS, port=RESTAPI_DEFAULT_PORT, username=None, password=None):
         logger.critical(errMsg)
         return
 
+    commands = ("help", "new", "use", "data", "log", "status", "option", "stop", "kill", "list", "flush", "exit", "bye", "quit")
+    autoCompletion(AUTOCOMPLETE_TYPE.API, commands=commands)
+
     taskid = None
     logger.info("Type 'help' or '?' for list of available commands")
 

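The `commands` tuple wired into `client()` above feeds sqlmap's `autoCompletion()` helper, which provides readline-style tab completion in the API shell. A minimal sketch of the same idea using only the standard library (the names here are illustrative, not sqlmap's actual internals):

```python
COMMANDS = ("help", "new", "use", "data", "log", "status", "option",
            "stop", "kill", "list", "flush", "exit", "bye", "quit")

def make_completer(commands):
    # readline calls the completer repeatedly with state=0, 1, 2, ...
    # until it returns None; each call returns the state-th match
    def complete(text, state):
        matches = [command for command in commands if command.startswith(text)]
        return matches[state] if state < len(matches) else None
    return complete

try:
    import readline
    readline.set_completer(make_completer(COMMANDS))
    readline.parse_and_bind("tab: complete")
except ImportError:
    pass  # no readline module (e.g. stock Python on Windows)
```

The guarded import mirrors why such helpers live behind a wrapper: `readline` availability varies by platform.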
@@ -22,7 +22,6 @@ class _Getch(object):
     def __call__(self):
         return self.impl()
 
-
 class _GetchUnix(object):
     def __init__(self):
         __import__("tty")
@@ -41,7 +40,6 @@ class _GetchUnix(object):
         termios.tcsetattr(fd, termios.TCSADRAIN, old_settings)
         return ch
 
-
 class _GetchWindows(object):
     def __init__(self):
         __import__("msvcrt")
@@ -50,7 +48,6 @@ class _GetchWindows(object):
         import msvcrt
         return msvcrt.getch()
 
-
 class _GetchMacCarbon(object):
     """
     A function which returns the current ASCII key that is down;
@@ -79,5 +76,4 @@ class _GetchMacCarbon(object):
         (what, msg, when, where, mod) = Carbon.Evt.GetNextEvent(0x0008)[1]
         return chr(msg & 0x000000FF)
 
-
 getch = _Getch()

@@ -36,7 +36,6 @@ from lib.core.settings import UNICODE_ENCODING
 from lib.request.basic import decodePage
 from thirdparty.socks import socks
 
-
 def _search(dork):
     """
     This method performs the effective search on Google providing

@@ -57,7 +57,7 @@ class SQLAlchemy(GenericConnector):
         if self.dialect == "sqlite":
             engine = _sqlalchemy.create_engine(conf.direct, connect_args={"check_same_thread": False})
         elif self.dialect == "oracle":
-            engine = _sqlalchemy.create_engine(conf.direct, connect_args={"allow_twophase": False})
+            engine = _sqlalchemy.create_engine(conf.direct)
         else:
             engine = _sqlalchemy.create_engine(conf.direct, connect_args={})
 

@@ -12,7 +12,7 @@ PYVERSION = sys.version.split()[0]
 if PYVERSION >= "3" or PYVERSION < "2.6":
     exit("[CRITICAL] incompatible Python version detected ('%s'). To successfully run sqlmap you'll have to use version 2.6.x or 2.7.x (visit 'https://www.python.org/downloads/')" % PYVERSION)
 
-extensions = ("bz2", "gzip", "ssl", "sqlite3", "zlib")
+extensions = ("bz2", "gzip", "pyexpat", "ssl", "sqlite3", "zlib")
 try:
     for _ in extensions:
         __import__(_)

@@ -49,12 +49,10 @@ class xrange(object):
         return hash(self._slice)
 
     def __cmp__(self, other):
-        return (cmp(type(self), type(other)) or
-                cmp(self._slice, other._slice))
+        return (cmp(type(self), type(other)) or cmp(self._slice, other._slice))
 
     def __repr__(self):
-        return '%s(%r, %r, %r)' % (type(self).__name__,
-                                   self.start, self.stop, self.step)
+        return '%s(%r, %r, %r)' % (type(self).__name__, self.start, self.stop, self.step)
 
     def __len__(self):
         return self._len()

@@ -19,9 +19,9 @@ from plugins.generic.connector import Connector as GenericConnector
 
 class Connector(GenericConnector):
     """
-    Homepage: http://code.google.com/p/ibm-db/
-    User guide: http://code.google.com/p/ibm-db/wiki/README
-    API: http://www.python.org/dev/peps/pep-0249/
+    Homepage: https://github.com/ibmdb/python-ibmdb
+    User guide: https://github.com/ibmdb/python-ibmdb/wiki/README
+    API: https://www.python.org/dev/peps/pep-0249/
     License: Apache License 2.0
     """
 

@@ -5,7 +5,6 @@ Copyright (c) 2006-2018 sqlmap developers (http://sqlmap.org/)
 See the file 'LICENSE' for copying permission
 """
 
-
 from lib.core.data import logger
 from plugins.generic.enumeration import Enumeration as GenericEnumeration
 

@@ -5,7 +5,6 @@ Copyright (c) 2006-2018 sqlmap developers (http://sqlmap.org/)
 See the file 'LICENSE' for copying permission
 """
 
-
 from lib.core.common import Backend
 from lib.core.common import Format
 from lib.core.data import conf

@@ -19,9 +19,9 @@ from plugins.generic.connector import Connector as GenericConnector
 
 class Connector(GenericConnector):
     """
-    Homepage: http://code.google.com/p/ibm-db/
-    User guide: http://code.google.com/p/ibm-db/wiki/README
-    API: http://www.python.org/dev/peps/pep-0249/
+    Homepage: https://github.com/ibmdb/python-ibmdb
+    User guide: https://github.com/ibmdb/python-ibmdb/wiki/README
+    API: https://www.python.org/dev/peps/pep-0249/
     License: Apache License 2.0
     """
 

@@ -15,7 +15,6 @@ from plugins.dbms.mssqlserver.syntax import Syntax
 from plugins.dbms.mssqlserver.takeover import Takeover
 from plugins.generic.misc import Miscellaneous
 
-
 class MSSQLServerMap(Syntax, Fingerprint, Enumeration, Filesystem, Miscellaneous, Takeover):
     """
     This class defines Microsoft SQL Server methods

@@ -21,9 +21,9 @@ from plugins.generic.connector import Connector as GenericConnector
 
 class Connector(GenericConnector):
     """
-    Homepage: http://pymssql.sourceforge.net/
-    User guide: http://pymssql.sourceforge.net/examples_pymssql.php
-    API: http://pymssql.sourceforge.net/ref_pymssql.php
+    Homepage: http://www.pymssql.org/en/stable/
+    User guide: http://www.pymssql.org/en/stable/pymssql_examples.html
+    API: http://www.pymssql.org/en/stable/ref/pymssql.html
     Debian package: python-pymssql
     License: LGPL
 

@@ -136,8 +136,8 @@ class Fingerprint(GenericFingerprint):
         self.createSupportTbl(self.fileTblName, self.tblField, "varchar(1000)")
         inject.goStacked("INSERT INTO %s(%s) VALUES (%s)" % (self.fileTblName, self.tblField, "@@VERSION"))
 
-        # Reference: http://en.wikipedia.org/wiki/Comparison_of_Microsoft_Windows_versions
-        # http://en.wikipedia.org/wiki/Windows_NT#Releases
+        # Reference: https://en.wikipedia.org/wiki/Comparison_of_Microsoft_Windows_versions
+        # https://en.wikipedia.org/wiki/Windows_NT#Releases
         versions = {
             "NT": ("4.0", (6, 5, 4, 3, 2, 1)),
             "2000": ("5.0", (4, 3, 2, 1)),

@@ -26,8 +26,8 @@ class Takeover(GenericTakeover):
     def spHeapOverflow(self):
         """
         References:
-        * http://www.microsoft.com/technet/security/bulletin/MS09-004.mspx
-        * http://support.microsoft.com/kb/959420
+        * https://docs.microsoft.com/en-us/security-updates/securitybulletins/2009/ms09-004
+        * https://support.microsoft.com/en-us/help/959420/ms09-004-vulnerabilities-in-microsoft-sql-server-could-allow-remote-co
         """
 
         returns = {

@@ -183,8 +183,15 @@ class Fingerprint(GenericFingerprint):
         # reading information_schema on some platforms is causing annoying timeout exits
         # Reference: http://bugs.mysql.com/bug.php?id=15855
 
+        # Determine if it is MySQL >= 8.0.0
+        if inject.checkBooleanExpression("ISNULL(JSON_STORAGE_FREE(NULL))"):
+            kb.data.has_information_schema = True
+            Backend.setVersion(">= 8.0.0")
+            setDbms("%s 8" % DBMS.MYSQL)
+            self.getBanner()
+
         # Determine if it is MySQL >= 5.0.0
-        if inject.checkBooleanExpression("ISNULL(TIMESTAMPADD(MINUTE,[RANDNUM],NULL))"):
+        elif inject.checkBooleanExpression("ISNULL(TIMESTAMPADD(MINUTE,[RANDNUM],NULL))"):
             kb.data.has_information_schema = True
             Backend.setVersion(">= 5.0.0")
             setDbms("%s 5" % DBMS.MYSQL)
@@ -196,9 +203,17 @@ class Fingerprint(GenericFingerprint):
             infoMsg = "actively fingerprinting %s" % DBMS.MYSQL
             logger.info(infoMsg)
 
-            # Check if it is MySQL >= 5.5.0
-            if inject.checkBooleanExpression("TO_SECONDS(950501)>0"):
-                Backend.setVersion(">= 5.5.0")
+            # Check if it is MySQL >= 5.7
+            if inject.checkBooleanExpression("ISNULL(JSON_QUOTE(NULL))"):
+                Backend.setVersion(">= 5.7")
 
+            # Check if it is MySQL >= 5.6
+            elif inject.checkBooleanExpression("ISNULL(VALIDATE_PASSWORD_STRENGTH(NULL))"):
+                Backend.setVersion(">= 5.6")
+
+            # Check if it is MySQL >= 5.5
+            elif inject.checkBooleanExpression("TO_SECONDS(950501)>0"):
+                Backend.setVersion(">= 5.5")
+
             # Check if it is MySQL >= 5.1.2 and < 5.5.0
             elif inject.checkBooleanExpression("@@table_open_cache=@@table_open_cache"):

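The two hunks above replace a single check with a ladder of feature probes: `JSON_STORAGE_FREE()` only exists from MySQL 8.0, `JSON_QUOTE()` from 5.7, `VALIDATE_PASSWORD_STRENGTH()` from 5.6 and `TO_SECONDS()` from 5.5, so the first expression that evaluates cleanly bounds the server version from below. A self-contained sketch of that fingerprinting pattern, where `check` stands in for sqlmap's `inject.checkBooleanExpression()` and the simulated target is hypothetical:

```python
# Probe expressions mirror the ones used in the diff above, newest first
PROBES = (
    (">= 8.0", "ISNULL(JSON_STORAGE_FREE(NULL))"),
    (">= 5.7", "ISNULL(JSON_QUOTE(NULL))"),
    (">= 5.6", "ISNULL(VALIDATE_PASSWORD_STRENGTH(NULL))"),
    (">= 5.5", "TO_SECONDS(950501)>0"),
)

def fingerprint(check):
    # the first probe the target accepts gives a lower bound on its version
    for version, expression in PROBES:
        if check(expression):
            return version
    return None

# simulated target that only understands the 5.5/5.6-era functions
supported = {"ISNULL(VALIDATE_PASSWORD_STRENGTH(NULL))", "TO_SECONDS(950501)>0"}
print(fingerprint(lambda expression: expression in supported))  # -> >= 5.6
```

The same ladder shape appears in the PostgreSQL hunk further down (`XMLTABLE()` for 10.0, `SIND()` for 9.6, `TO_JSONB()` for 9.5).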
@@ -24,10 +24,10 @@ os.environ["NLS_LANG"] = ".AL32UTF8"
 
 class Connector(GenericConnector):
     """
-    Homepage: http://cx-oracle.sourceforge.net/
-    User guide: http://cx-oracle.sourceforge.net/README.txt
-    API: http://cx-oracle.sourceforge.net/html/index.html
-    License: http://cx-oracle.sourceforge.net/LICENSE.txt
+    Homepage: https://oracle.github.io/python-cx_Oracle/
+    User guide: https://cx-oracle.readthedocs.io/en/latest/
+    API: https://wiki.python.org/moin/DatabaseProgramming
+    License: https://cx-oracle.readthedocs.io/en/latest/license.html#license
     """
 
     def __init__(self):

@@ -68,23 +68,23 @@ class Fingerprint(GenericFingerprint):
         infoMsg = "testing %s" % DBMS.ORACLE
         logger.info(infoMsg)
 
-        # NOTE: SELECT ROWNUM=ROWNUM FROM DUAL does not work connecting
-        # directly to the Oracle database
+        # NOTE: SELECT LENGTH(SYSDATE)=LENGTH(SYSDATE) FROM DUAL does
+        # not work connecting directly to the Oracle database
         if conf.direct:
             result = True
         else:
-            result = inject.checkBooleanExpression("ROWNUM=ROWNUM")
+            result = inject.checkBooleanExpression("LENGTH(SYSDATE)=LENGTH(SYSDATE)")
 
         if result:
             infoMsg = "confirming %s" % DBMS.ORACLE
             logger.info(infoMsg)
 
-            # NOTE: SELECT LENGTH(SYSDATE)=LENGTH(SYSDATE) FROM DUAL does
+            # NOTE: SELECT NVL(RAWTOHEX([RANDNUM1]),[RANDNUM1])=RAWTOHEX([RANDNUM1]) FROM DUAL does
             # not work connecting directly to the Oracle database
             if conf.direct:
                 result = True
             else:
-                result = inject.checkBooleanExpression("LENGTH(SYSDATE)=LENGTH(SYSDATE)")
+                result = inject.checkBooleanExpression("NVL(RAWTOHEX([RANDNUM1]),[RANDNUM1])=RAWTOHEX([RANDNUM1])")
 
         if not result:
             warnMsg = "the back-end DBMS is not %s" % DBMS.ORACLE

@@ -60,7 +60,7 @@ class Fingerprint(GenericFingerprint):
         """
         References for fingerprint:
 
-        * http://www.postgresql.org/docs/9.1/interactive/release.html (up to 9.1.3)
+        * https://www.postgresql.org/docs/current/static/release.html
         """
 
         if not conf.extensiveFp and Backend.isDbmsWithin(PGSQL_ALIASES):
@@ -97,8 +97,12 @@ class Fingerprint(GenericFingerprint):
             infoMsg = "actively fingerprinting %s" % DBMS.PGSQL
             logger.info(infoMsg)
 
-            if inject.checkBooleanExpression("TO_JSONB(1) IS NOT NULL"):
-                Backend.setVersion(">= 9.5.0")
+            if inject.checkBooleanExpression("XMLTABLE(NULL) IS NULL"):
+                Backend.setVersion(">= 10.0")
+            elif inject.checkBooleanExpression("SIND(0)=0"):
+                Backend.setVersion(">= 9.6.0", "< 10.0")
+            elif inject.checkBooleanExpression("TO_JSONB(1) IS NOT NULL"):
+                Backend.setVersion(">= 9.5.0", "< 9.6.0")
             elif inject.checkBooleanExpression("JSON_TYPEOF(NULL) IS NULL"):
                 Backend.setVersionList([">= 9.4.0", "< 9.5.0"])
             elif inject.checkBooleanExpression("ARRAY_REPLACE(NULL,1,1) IS NULL"):

@@ -19,7 +19,6 @@ from lib.core.exception import SqlmapConnectionException
 from lib.core.exception import SqlmapMissingDependence
 from plugins.generic.connector import Connector as GenericConnector
 
-
 class Connector(GenericConnector):
     """
     Homepage: http://pysqlite.googlecode.com/ and http://packages.ubuntu.com/quantal/python-sqlite

@@ -290,6 +290,24 @@ class Databases:
                 db = safeSQLIdentificatorNaming(db)
                 table = safeSQLIdentificatorNaming(unArrayizeValue(table), True)
 
+                if conf.getComments:
+                    _ = queries[Backend.getIdentifiedDbms()].table_comment
+                    if hasattr(_, "query"):
+                        if Backend.getIdentifiedDbms() in (DBMS.ORACLE, DBMS.DB2):
+                            query = _.query % (unsafeSQLIdentificatorNaming(db.upper()), unsafeSQLIdentificatorNaming(table.upper()))
+                        else:
+                            query = _.query % (unsafeSQLIdentificatorNaming(db), unsafeSQLIdentificatorNaming(table))
+
+                        comment = unArrayizeValue(inject.getValue(query, blind=False, time=False))
+                        if not isNoneValue(comment):
+                            infoMsg = "retrieved comment '%s' for table '%s' " % (comment, unsafeSQLIdentificatorNaming(table))
+                            infoMsg += "in database '%s'" % unsafeSQLIdentificatorNaming(db)
+                            logger.info(infoMsg)
+                    else:
+                        warnMsg = "on %s it is not " % Backend.getIdentifiedDbms()
+                        warnMsg += "possible to get column comments"
+                        singleTimeWarnMessage(warnMsg)
+
                 if db not in kb.data.cachedTables:
                     kb.data.cachedTables[db] = [table]
                 else:
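The comment lookup added above has to respect catalog case conventions: Oracle and DB2 store unquoted identifiers upper-cased, which is why the database and table names are upper-cased before substitution into the DBMS-specific `table_comment` query. A reduced sketch of that branch (the query template below is a placeholder for illustration, not sqlmap's real one):

```python
def build_comment_query(template, db, table, dbms):
    # Oracle and DB2 keep unquoted identifiers upper-cased in their
    # catalogs, so normalize before substituting into the metadata query
    if dbms in ("Oracle", "IBM DB2"):
        db, table = db.upper(), table.upper()
    return template % (db, table)

# hypothetical template, for illustration only
TEMPLATE = "SELECT comments FROM all_tab_comments WHERE owner='%s' AND table_name='%s'"
print(build_comment_query(TEMPLATE, "testdb", "users", "Oracle"))
```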
@@ -353,6 +371,24 @@ class Databases:
                 table = safeSQLIdentificatorNaming(table, True)
                 tables.append(table)
 
+                if conf.getComments:
+                    _ = queries[Backend.getIdentifiedDbms()].table_comment
+                    if hasattr(_, "query"):
+                        if Backend.getIdentifiedDbms() in (DBMS.ORACLE, DBMS.DB2):
+                            query = _.query % (unsafeSQLIdentificatorNaming(db.upper()), unsafeSQLIdentificatorNaming(table.upper()))
+                        else:
+                            query = _.query % (unsafeSQLIdentificatorNaming(db), unsafeSQLIdentificatorNaming(table))
+
+                        comment = unArrayizeValue(inject.getValue(query, union=False, error=False))
+                        if not isNoneValue(comment):
+                            infoMsg = "retrieved comment '%s' for table '%s' " % (comment, unsafeSQLIdentificatorNaming(table))
+                            infoMsg += "in database '%s'" % unsafeSQLIdentificatorNaming(db)
+                            logger.info(infoMsg)
+                    else:
+                        warnMsg = "on %s it is not " % Backend.getIdentifiedDbms()
+                        warnMsg += "possible to get column comments"
+                        singleTimeWarnMessage(warnMsg)
+
             if tables:
                 kb.data.cachedTables[db] = tables
             else:

sqlmap.conf (24 changes)
@@ -241,7 +241,7 @@ skipStatic = False
 # Regexp to exclude parameters from testing (e.g. "ses").
 paramExclude =
 
-# Force back-end DBMS to this value. If this option is set, the back-end
+# Force back-end DBMS to provided value. If this option is set, the back-end
 # DBMS identification process will be minimized as needed.
 # If not set, sqlmap will detect back-end DBMS automatically by default.
 # Valid: mssql, mysql, mysql 4, mysql 5, oracle, pgsql, sqlite, sqlite3,
@@ -256,7 +256,7 @@ dbms =
 # Syntax: username:password
 dbmsCred =
 
-# Force back-end DBMS operating system to this value. If this option is
+# Force back-end DBMS operating system to provided value. If this option is
 # set, the back-end DBMS identification process will be minimized as
 # needed.
 # If not set, sqlmap will detect back-end DBMS operating system
@@ -367,28 +367,32 @@ tech = BEUSTQ
 # Default: 5
 timeSec = 5
 
-# Range of columns to test for
+# Range of columns to test for.
 # Valid: range of integers
 # Example: 1-10
 uCols =
 
-# Character to use for bruteforcing number of columns
+# Character to use for bruteforcing number of columns.
 # Valid: string
 # Example: NULL
 uChar =
 
-# Table to use in FROM part of UNION query SQL injection
+# Table to use in FROM part of UNION query SQL injection.
 # Valid: string
 # Example: INFORMATION_SCHEMA.COLLATIONS
 uFrom =
 
-# Domain name used for DNS exfiltration attack
+# Domain name used for DNS exfiltration attack.
 # Valid: string
 dnsDomain =
 
-# Resulting page URL searched for second-order response
+# Resulting page URL searched for second-order response.
 # Valid: string
-secondOrder =
+secondUrl =
 
+# Load second-order HTTP request from file.
+# Valid: string
+secondReq =
+
 
 [Fingerprint]
@@ -481,7 +485,7 @@ dumpAll = False
 # Valid: True or False
 search = False
 
-# Retrieve back-end database management system comments.
+# Check for database management system database comments during enumeration.
 # Valid: True or False
 getComments = False
 
@@ -710,7 +714,7 @@ forms = False
 # Valid: True or False
 freshQueries = False
 
-# Use DBMS hex function(s) for data retrieval.
+# Use hex conversion during data retrieval.
 # Valid: True or False
 hexConvert = False
 

sqlmap.py (37 changes)
@@ -57,6 +57,7 @@ try:
     from lib.core.exception import SqlmapUserQuitException
     from lib.core.option import initOptions
     from lib.core.option import init
+    from lib.core.patch import dirtyPatches
     from lib.core.settings import GIT_PAGE
     from lib.core.settings import IS_WIN
     from lib.core.settings import LEGAL_DISCLAIMER
@@ -108,13 +109,13 @@ def checkEnvironment():
     for _ in ("SqlmapBaseException", "SqlmapShellQuitException", "SqlmapSilentQuitException", "SqlmapUserQuitException"):
         globals()[_] = getattr(sys.modules["lib.core.exception"], _)
 
-
 def main():
     """
     Main function of sqlmap when running from command line.
     """
 
     try:
+        dirtyPatches()
         checkEnvironment()
         setPaths(modulePath())
         banner()
@@ -142,10 +143,7 @@ def main():
|
|||||||
|
|
||||||
if not conf.updateAll:
|
if not conf.updateAll:
|
||||||
# Postponed imports (faster start)
|
# Postponed imports (faster start)
|
||||||
if conf.profile:
|
if conf.smokeTest:
|
||||||
from lib.core.profiling import profile
|
|
||||||
profile()
|
|
||||||
elif conf.smokeTest:
|
|
||||||
from lib.core.testing import smokeTest
|
from lib.core.testing import smokeTest
|
||||||
smokeTest()
|
smokeTest()
|
||||||
elif conf.liveTest:
|
elif conf.liveTest:
|
||||||
@@ -153,15 +151,20 @@ def main():
|
|||||||
liveTest()
|
liveTest()
|
||||||
else:
|
else:
|
||||||
from lib.controller.controller import start
|
from lib.controller.controller import start
|
||||||
try:
|
if conf.profile:
|
||||||
start()
|
from lib.core.profiling import profile
|
||||||
except thread.error as ex:
|
globals()["start"] = start
|
||||||
if "can't start new thread" in getSafeExString(ex):
|
profile()
|
||||||
errMsg = "unable to start new threads. Please check OS (u)limits"
|
else:
|
||||||
logger.critical(errMsg)
|
try:
|
||||||
raise SystemExit
|
start()
|
||||||
else:
|
except thread.error as ex:
|
||||||
raise
|
if "can't start new thread" in getSafeExString(ex):
|
||||||
|
errMsg = "unable to start new threads. Please check OS (u)limits"
|
||||||
|
logger.critical(errMsg)
|
||||||
|
raise SystemExit
|
||||||
|
else:
|
||||||
|
raise
|
||||||
|
|
||||||
except SqlmapUserQuitException:
|
except SqlmapUserQuitException:
|
||||||
errMsg = "user quit"
|
errMsg = "user quit"
|
||||||
@@ -265,8 +268,8 @@ def main():
|
|||||||
raise SystemExit
|
raise SystemExit
|
||||||
|
|
||||||
elif all(_ in excMsg for _ in ("twophase", "sqlalchemy")):
|
elif all(_ in excMsg for _ in ("twophase", "sqlalchemy")):
|
||||||
errMsg = "please update the 'sqlalchemy' package "
|
errMsg = "please update the 'sqlalchemy' package (>= 1.1.11) "
|
||||||
errMsg += "(Reference: https://github.com/apache/incubator-superset/issues/3447)"
|
errMsg += "(Reference: https://qiita.com/tkprof/items/7d7b2d00df9c5f16fffe)"
|
||||||
logger.error(errMsg)
|
logger.error(errMsg)
|
||||||
raise SystemExit
|
raise SystemExit
|
||||||
|
|
||||||
@@ -398,4 +401,4 @@ if __name__ == "__main__":
|
|||||||
main()
|
main()
|
||||||
else:
|
else:
|
||||||
# cancelling postponed imports (because of Travis CI checks)
|
# cancelling postponed imports (because of Travis CI checks)
|
||||||
from lib.controller.controller import start
|
from lib.controller.controller import start
|
||||||
|
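The sqlmap.py rework above moves `--profile` handling next to the controller import and publishes `start` into module globals before calling `profile()`, presumably so the profiler can resolve the entry point by name. A minimal sketch of that pattern using the standard `cProfile` module (the names `start` and `profile` here mirror the diff but are illustrative, not sqlmap's actual internals):

```python
import cProfile
import io
import pstats

def start():
    # Stand-in for sqlmap's controller entry point.
    return sum(i * i for i in range(1000))

def profile():
    # Look the entry point up by its global name, as the rewritten
    # sqlmap.py arranges via globals()["start"] = start.
    profiler = cProfile.Profile()
    result = profiler.runcall(globals()["start"])
    stream = io.StringIO()
    pstats.Stats(profiler, stream=stream).sort_stats("cumulative").print_stats(5)
    return result

globals()["start"] = start
print(profile())
```

Binding through `globals()` keeps the postponed import pattern intact: the profiler module never has to import the controller itself.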
41 tamper/0x2char.py (new file)
@@ -0,0 +1,41 @@
+#!/usr/bin/env python
+
+"""
+Copyright (c) 2006-2018 sqlmap developers (http://sqlmap.org/)
+See the file 'LICENSE' for copying permission
+"""
+
+import re
+
+from lib.core.enums import PRIORITY
+
+__priority__ = PRIORITY.NORMAL
+
+def dependencies():
+    pass
+
+def tamper(payload, **kwargs):
+    """
+    Replaces each (MySQL) 0x<hex> encoded string with equivalent CONCAT(CHAR(),...) counterpart
+
+    Tested against:
+        * MySQL 4, 5.0 and 5.5
+
+    Notes:
+        * Useful in cases when web application does the upper casing
+
+    >>> tamper('SELECT 0xdeadbeef')
+    'SELECT CONCAT(CHAR(222),CHAR(173),CHAR(190),CHAR(239))'
+    """
+
+    retVal = payload
+
+    if payload:
+        for match in re.finditer(r"\b0x([0-9a-f]+)\b", retVal):
+            if len(match.group(1)) > 2:
+                result = "CONCAT(%s)" % ','.join("CHAR(%d)" % ord(_) for _ in match.group(1).decode("hex"))
+            else:
+                result = "CHAR(%d)" % ord(match.group(1).decode("hex"))
+            retVal = retVal.replace(match.group(0), result)
+
+    return retVal
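The new tamper script relies on `str.decode("hex")`, which exists only in Python 2 (sqlmap's target interpreter at the time). A functionally equivalent Python 3 sketch of the same 0x&lt;hex&gt; → CONCAT(CHAR(),...) rewrite, standalone and without sqlmap's tamper plumbing:

```python
import re

def tamper_0x2char(payload):
    """Rewrite MySQL 0x<hex> literals as CONCAT(CHAR(...),...) expressions.

    Python 3 port of the transform in tamper/0x2char.py: bytes.fromhex()
    replaces the Python 2-only str.decode("hex").
    """
    retVal = payload
    for match in re.finditer(r"\b0x([0-9a-f]+)\b", retVal):
        raw = bytes.fromhex(match.group(1))
        if len(match.group(1)) > 2:
            result = "CONCAT(%s)" % ','.join("CHAR(%d)" % b for b in raw)
        else:
            result = "CHAR(%d)" % raw[0]
        retVal = retVal.replace(match.group(0), result)
    return retVal

print(tamper_0x2char("SELECT 0xdeadbeef"))
# SELECT CONCAT(CHAR(222),CHAR(173),CHAR(190),CHAR(239))
```

As the docstring notes, the point of the rewrite is that `CONCAT(CHAR(222),...)` survives web applications that upper-case the payload, whereas `0xDEADBEEF` would no longer match the lowercase hex-digit grammar MySQL expects from sqlmap's encoder.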
@@ -46,7 +46,7 @@ def tamper(payload, **kwargs):
             _ = "%s %s NOT BETWEEN 0 AND %s" % (match.group(2), match.group(4), match.group(5))
             retVal = retVal.replace(match.group(0), _)
         else:
-            retVal = re.sub(r"\s*>\s*(\d+|'[^']+'|\w+\(\d+\))", " NOT BETWEEN 0 AND \g<1>", payload)
+            retVal = re.sub(r"\s*>\s*(\d+|'[^']+'|\w+\(\d+\))", r" NOT BETWEEN 0 AND \g<1>", payload)
 
     if retVal == payload:
         match = re.search(r"(?i)(\b(AND|OR)\b\s+)(?!.*\b(AND|OR)\b)([^=]+?)\s*=\s*(\w+)\s*", payload)
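The only change in this hunk is the `r` prefix on the replacement string. Python leaves the unknown escape `\g` intact, so `"\g<1>"` and `r"\g<1>"` currently behave identically, but the un-prefixed form has emitted a DeprecationWarning since Python 3.6 and invalid escapes are slated to become errors. The substitution itself turns a `>` comparison into a `NOT BETWEEN` expression; a quick standalone demonstration with the patched pattern:

```python
import re

payload = "AND id > 5"
# Same pattern and replacement as tamper/between.py, raw-string prefix included.
rewritten = re.sub(r"\s*>\s*(\d+|'[^']+'|\w+\(\d+\))",
                   r" NOT BETWEEN 0 AND \g<1>",
                   payload)
print(rewritten)  # AND id NOT BETWEEN 0 AND 5
```

`id > 5` and `id NOT BETWEEN 0 AND 5` are not equivalent for negative values, but for the non-negative comparands sqlmap generates the rewrite preserves the truth value while avoiding the `>` character some filters block.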
@@ -35,6 +35,6 @@ def tamper(payload, **kwargs):
     retVal = payload
 
     if payload:
-        retVal = re.sub(r"\b(\w+)\(", "\g<1>/**/(", retVal)
+        retVal = re.sub(r"\b(\w+)\(", r"\g<1>/**/(", retVal)
 
     return retVal
@@ -22,6 +22,6 @@ def tamper(payload, **kwargs):
     retVal = payload
 
     if payload:
-        retVal = re.sub(r"(?i)(information_schema)\.", "\g<1>/**/.", payload)
+        retVal = re.sub(r"(?i)(information_schema)\.", r"\g<1>/**/.", payload)
 
     return retVal
@@ -43,7 +43,7 @@ def tamper(payload, **kwargs):
             words.add(word)
 
     for word in words:
-        retVal = re.sub("(?<=\W)%s(?=[^A-Za-z_(]|\Z)" % word, "%s%s%s" % (' ' * random.randrange(1, 4), word, ' ' * random.randrange(1, 4)), retVal)
-        retVal = re.sub("(?<=\W)%s(?=[(])" % word, "%s%s" % (' ' * random.randrange(1, 4), word), retVal)
+        retVal = re.sub(r"(?<=\W)%s(?=[^A-Za-z_(]|\Z)" % word, "%s%s%s" % (' ' * random.randrange(1, 4), word, ' ' * random.randrange(1, 4)), retVal)
+        retVal = re.sub(r"(?<=\W)%s(?=[(])" % word, "%s%s" % (' ' * random.randrange(1, 4), word), retVal)
 
     return retVal
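The same raw-string cleanup, applied to the pattern side this time: `\W` and `\Z` are valid regex escapes but invalid string escapes, so the patterns now carry the `r` prefix too. The tamper pads recognised keywords with one to three random spaces; whatever padding the RNG picks, the payload with whitespace collapsed is unchanged. A small standalone sketch with the patched pattern (the seed is only there to make the run repeatable):

```python
import random
import re

random.seed(42)  # illustrative only; the real tamper is intentionally random

word = "UNION"
payload = "1 UNION SELECT NULL"
# Patched pattern from tamper/multiplespaces.py: keyword not followed by an
# identifier character or '(' gets random space padding on both sides.
padded = re.sub(r"(?<=\W)%s(?=[^A-Za-z_(]|\Z)" % word,
                "%s%s%s" % (' ' * random.randrange(1, 4), word, ' ' * random.randrange(1, 4)),
                payload)
print(padded)
```

Collapsing runs of spaces recovers the original payload, which is why signature-based filters keyed to exact spacing miss the padded form while the DBMS still parses it.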
972 thirdparty/socks/socks.py (vendored)
File diff suppressed because it is too large.

134 txt/checksum.md5
@@ -5,10 +5,10 @@ b0eb597c613afeff9d62898cf4c67a56 extra/cloak/cloak.py
 e0911386106b95d2ba4b12d651b2eb16 extra/dbgtool/dbgtool.py
 1e5532ede194ac9c083891c2f02bca93 extra/dbgtool/__init__.py
 acba8b5dc93db0fe6b2b04ff0138c33c extra/icmpsh/icmpsh.exe_
-fe39e5c315d63afff5cb99ec42fc883f extra/icmpsh/icmpsh_m.py
+708e9fd35dabcbfcd10e91bbc14f091f extra/icmpsh/icmpsh_m.py
 2d020d2bdcee1170805f48839fdb89df extra/icmpsh/__init__.py
 1e5532ede194ac9c083891c2f02bca93 extra/__init__.py
-27629e01ba722271c990ad4b27151917 extra/mssqlsig/update.py
+fe141ec3178a46e7151c7f34bb747c68 extra/mssqlsig/update.py
 ff90cb0366f7cefbdd6e573e27e6238c extra/runcmd/runcmd.exe_
 1e5532ede194ac9c083891c2f02bca93 extra/safe2bin/__init__.py
 b6c0f2047e9bea90f4d5c5806c0f6a9a extra/safe2bin/safe2bin.py
@@ -16,93 +16,95 @@ d229479d02d21b29f209143cb0547780 extra/shellcodeexec/linux/shellcodeexec.x32_
 2fe2f94eebc62f7614f0391a8a90104f extra/shellcodeexec/linux/shellcodeexec.x64_
 c55b400b72acc43e0e59c87dd8bb8d75 extra/shellcodeexec/windows/shellcodeexec.x32.exe_
 220745c50d375dad7aefebf8ca3611ef extra/shutils/duplicates.py
+e4805169a081b834ca51a60a150c7247 extra/shutils/newlines.py
 71b9d4357c31db013ecda27433830090 extra/shutils/pylint.py
-c88d66597f4aab719bde4542b0a1a6e0 extra/shutils/regressiontest.py
+1056d1112ba5130868178cb495d22b1d extra/shutils/regressiontest.py
 1e5532ede194ac9c083891c2f02bca93 extra/sqlharvest/__init__.py
 b3e60ea4e18a65c48515d04aab28ff68 extra/sqlharvest/sqlharvest.py
 0f581182871148b0456a691ae85b04c0 lib/controller/action.py
-aea19b45c6154035a689954719c753dc lib/controller/checks.py
+94872ce72dc2628cdedf2eb82cba716e lib/controller/checks.py
 c414cecdb0472c92cf50ed5b01e4438c lib/controller/controller.py
 c7443613a0a2505b1faec931cee2a6ef lib/controller/handler.py
 1e5532ede194ac9c083891c2f02bca93 lib/controller/__init__.py
-b1990c7805943f0c973a853bba981d96 lib/core/agent.py
+0adf547455a76dc71e6a599e52da1ed9 lib/core/agent.py
 fd8f239e259afaf5f24bcf34a0ad187f lib/core/bigarray.py
-f42e346d33199b4f663cff6efe2be775 lib/core/common.py
+acec51826b280ad96dedbb56515e3988 lib/core/common.py
 0d082da16c388b3445e656e0760fb582 lib/core/convert.py
 9f87391b6a3395f7f50830b391264f27 lib/core/data.py
 72016ea5c994a711a262fd64572a0fcd lib/core/datatype.py
-04638422b6ad1613238a9abf4fdf6491 lib/core/decorators.py
+4086fb55f42e27de5330505605baad0f lib/core/decorators.py
 fbb55cc6100318ff922957b6577dc58f lib/core/defaults.py
-da98f5288aad57855c6d287ba3b397a1 lib/core/dicts.py
+db165596ef0a3e19ec59c24192bb318d lib/core/dicts.py
 9ea8a043030796e6faef7f7e957729d5 lib/core/dump.py
-bfffdc74a93ff647c49b79c215d96d8a lib/core/enums.py
+ab3f4f3e3019add5f4a2e28f7e8748a4 lib/core/enums.py
 cada93357a7321655927fc9625b3bfec lib/core/exception.py
 1e5532ede194ac9c083891c2f02bca93 lib/core/__init__.py
 458a194764805cd8312c14ecd4be4d1e lib/core/log.py
-c9a56e58984420a5abb7a3f7aadc196d lib/core/optiondict.py
-83345a6b0b7e187d2cbcc280a509f03e lib/core/option.py
-7cfd04e583cca782b843f6f6d973981a lib/core/profiling.py
+13c0a490b5a928b64236b4a15e578267 lib/core/optiondict.py
+c82dee0f62e729213b92f5ec85f74b70 lib/core/option.py
+c8c386d644d57c659d74542f5f57f632 lib/core/patch.py
+6783160150b4711d02c56ee2beadffdb lib/core/profiling.py
 6f654e1715571eff68a0f8af3d62dcf8 lib/core/readlineng.py
 0c3eef46bdbf87e29a3f95f90240d192 lib/core/replication.py
 a7db43859b61569b601b97f187dd31c5 lib/core/revision.py
 fcb74fcc9577523524659ec49e2e964b lib/core/session.py
-1b801d825811ee4362e07e568e8a928e lib/core/settings.py
-0dfc2ed40adf72e302291f6ecd4406f6 lib/core/shell.py
+9e328a8f6e8e76c8d78779a8041d47c7 lib/core/settings.py
+dd68a9d02fccb4fa1428b20e15b0db5d lib/core/shell.py
 a7edc9250d13af36ac0108f259859c19 lib/core/subprocessng.py
-6306284edcccc185b2df085438572b0d lib/core/target.py
+95f04c1c1d8c3998d86e1bdf0e12771c lib/core/target.py
 72d499ca8d792e90a1ebfb2ad2341a51 lib/core/testing.py
 de9922a29c71a235cb95a916ff925db2 lib/core/threads.py
 c40758411bb0bd68764d78e0bb72bd0f lib/core/unescaper.py
-af2d1810b6a7ebc61689a53c253ddbaa lib/core/update.py
+9d395b143be295a143eb5c9b926f3569 lib/core/update.py
 e772deb63270375e685fa5a7b775c382 lib/core/wordlist.py
 1e5532ede194ac9c083891c2f02bca93 lib/__init__.py
 7620f1f4b8791e13c7184c06b5421754 lib/parse/banner.py
-5e46fac7f824ba8ab8003a1cd47d8af3 lib/parse/cmdline.py
+3dd11b8be62e15a9d54cf5f08c603ffc lib/parse/cmdline.py
 fb2e2f05dde98caeac6ccf3e67192177 lib/parse/configfile.py
 3794ff139869f5ae8e81cfdbe5714f56 lib/parse/handler.py
-aaad2a0d80f05eaebe52c71519b3dfc7 lib/parse/headers.py
-33f21b11b7963062df8fa2292229df80 lib/parse/html.py
+6bab53ea9d75bc9bb8169d3e8f3f149f lib/parse/headers.py
+1bc6ddaeada0f2425fa9aae226854ca8 lib/parse/html.py
 1e5532ede194ac9c083891c2f02bca93 lib/parse/__init__.py
-ec4e56bbb1349176b2a22e0b99ba6a55 lib/parse/payloads.py
+f2af274126ce0a789027d35d367f2b9e lib/parse/payloads.py
 492654567e72b6a14584651fcd9f16e6 lib/parse/sitemap.py
 30eed3a92a04ed2c29770e1b10d39dc0 lib/request/basicauthhandler.py
-596988f14408cde1a2d3b5c9f231873a lib/request/basic.py
+2b81435f5a7519298c15c724e3194a0d lib/request/basic.py
 c0cabedead14b8a23353b606672cff42 lib/request/comparison.py
-5b7f216827207c085df96bb56ed5e600 lib/request/connect.py
+039f0f7cf997856fa2f6e8d5d69f7ae9 lib/request/connect.py
 dd4598675027fae99f2e2475b05986da lib/request/direct.py
 2044fce3f4ffa268fcfaaf63241b1e64 lib/request/dns.py
-eee965d781546d05f36cfd14af050913 lib/request/httpshandler.py
+98535d0efca5551e712fcc4b34a3f772 lib/request/httpshandler.py
 1e5532ede194ac9c083891c2f02bca93 lib/request/__init__.py
-b188a11542a996276abbbc48913501c3 lib/request/inject.py
+a5cbc19ee18bd4b848515eb3ea3291f0 lib/request/inject.py
 aaf956c1e9855836c3f372e29d481393 lib/request/methodrequest.py
 51eeaa8abf5ba62aaaade66d46ff8b00 lib/request/pkihandler.py
-aa7cb67139bbc57d67a728fd2abf80ed lib/request/rangehandler.py
-aa809d825b33bea76a63ecd97cf7792c lib/request/redirecthandler.py
+2c3774b72586985719035b195f144d7b lib/request/rangehandler.py
+3cd9d17fc52bb62db29e0e24fc4d8a97 lib/request/redirecthandler.py
 7f12d8f3b6665ed7053954bba70ff718 lib/request/templates.py
-8d31425f36a7a9c093eb9bef44589593 lib/takeover/abstraction.py
+747f9941a68361bd779ec760f71568e9 lib/takeover/abstraction.py
 acc1db3667bf910b809eb279b60595eb lib/takeover/icmpsh.py
 1e5532ede194ac9c083891c2f02bca93 lib/takeover/__init__.py
 46ff5840b29531412bcaa05dac190413 lib/takeover/metasploit.py
 fb9e34d558293b5d6b9727f440712886 lib/takeover/registry.py
 48575dde7bb867b7937769f569a98309 lib/takeover/udf.py
-4584ac6ee5c13d4d395f0a7a21d8478c lib/takeover/web.py
+f6f835e4190a55e42d13c1e7ca3f728f lib/takeover/web.py
 f1decf0a987bd3a4bc757212cbe6a6c8 lib/takeover/xp_cmdshell.py
-2543e14cc7f6e239b49dd40f41bc34fa lib/techniques/blind/inference.py
+4a7f231e597f754e9fcd116d13ad1a4d lib/techniques/blind/inference.py
 1e5532ede194ac9c083891c2f02bca93 lib/techniques/blind/__init__.py
 1e5532ede194ac9c083891c2f02bca93 lib/techniques/dns/__init__.py
-855355a1a216f6b267a5f089028f1cd8 lib/techniques/dns/test.py
-733f3419ff2ea23f75bc24e36f4746d9 lib/techniques/dns/use.py
+799faf9008527d2e9da9d923e50f685a lib/techniques/dns/test.py
+48a24f48da791e67309003fd5e8428cb lib/techniques/dns/use.py
 1e5532ede194ac9c083891c2f02bca93 lib/techniques/error/__init__.py
-f999f2e88dea9ac8831eb2f468478b5f lib/techniques/error/use.py
+f5fb02487edaf9adaa81d54324c84f8f lib/techniques/error/use.py
 1e5532ede194ac9c083891c2f02bca93 lib/techniques/__init__.py
 1e5532ede194ac9c083891c2f02bca93 lib/techniques/union/__init__.py
-a36be917cf86a5ee407c83d74567f324 lib/techniques/union/test.py
+94d7a22bb6725a91e84ba2cd9973e96d lib/techniques/union/test.py
 11ecf2effbe9f40b361843d546c3c521 lib/techniques/union/use.py
-c552f8d924d962a26f2ded250bcea3b8 lib/utils/api.py
+77ff35587af9e3dfde63b8327e230f9a lib/utils/api.py
 37dfb641358669f62c2acedff241348b lib/utils/brute.py
 31b1e7eb489eac837db6a2bc1dcb7da7 lib/utils/crawler.py
 de9620f03231d8329ee8434884b6bacd lib/utils/deps.py
-635ed692ab141d428d0957b71b25c1aa lib/utils/getch.py
+f7af65aa47329d021e2b2cc8521b42a4 lib/utils/getch.py
 7af29f61302c8693cd6436d4b69e22d3 lib/utils/har.py
 062e4e8fc43ac54305a75ddd0d482f81 lib/utils/hashdb.py
 cc1cfe36057f1d9bbdcba1bcc03359f9 lib/utils/hash.py
@@ -111,11 +113,11 @@ cc1cfe36057f1d9bbdcba1bcc03359f9 lib/utils/hash.py
 010d8327239d33af4ce9f25683cfc012 lib/utils/pivotdumptable.py
 5cb78b0e60fd7fd84502d62cf85d2064 lib/utils/progress.py
 0ec5cec9d93d5ffd1eaeda6e942ecadf lib/utils/purge.py
-2e3e7213f50b52fc4d5a014a2ff8d163 lib/utils/search.py
-236a8d9e596602b53f8e0aa09c30c0ef lib/utils/sqlalchemy.py
+2c5a655c8e94cbe2664ee497752ac1f2 lib/utils/search.py
+571884f530796534f03c49cf3f380a4c lib/utils/sqlalchemy.py
 dcc25183c6bd85b172c87cfcbc305ab6 lib/utils/timeout.py
-3d230e342a6c8d60ac7c68c556fbba9b lib/utils/versioncheck.py
-7348ee704485651737ddbe3538271be9 lib/utils/xrange.py
+fad14adffa8b640a15b06db955031695 lib/utils/versioncheck.py
+e9e73cd6bd814dd7823a9da913cea61c lib/utils/xrange.py
 b9d2761f47fec3d98b88311a263fd5db plugins/dbms/access/connector.py
 3f1c50a1507d1c2f69c20c706230e2e2 plugins/dbms/access/enumeration.py
 fcc66fc377db3681f7890ec55675564b plugins/dbms/access/filesystem.py
@@ -123,10 +125,10 @@ c2428c5c73d049abf4442ec1b9404a25 plugins/dbms/access/fingerprint.py
 e657b1b7a295a38ac9ce515158164f00 plugins/dbms/access/__init__.py
 77686d7c7e287d5db0a9a87f2c7d4902 plugins/dbms/access/syntax.py
 2f1d8706b51497623b2b59c07b552bdc plugins/dbms/access/takeover.py
-ead470b613e52e718a3062b63b518272 plugins/dbms/db2/connector.py
-0884e475c98701f8e698150aa122fb76 plugins/dbms/db2/enumeration.py
+8df07c2805aceb7d6fb4add40de84795 plugins/dbms/db2/connector.py
+4deeda463003ab71e7d2f34a263b5bbf plugins/dbms/db2/enumeration.py
 da9dccd1f9ec2cf1e53295125dd983a0 plugins/dbms/db2/filesystem.py
-ba492b2aaa6432b5548c5a8fa5eec435 plugins/dbms/db2/fingerprint.py
+b54dbf44590a5cbefb2b4f8e9a01a383 plugins/dbms/db2/fingerprint.py
 95b35cbd859bbced44e7f8fd84486d75 plugins/dbms/db2/__init__.py
 82d96d8fcfd565129580260040555623 plugins/dbms/db2/syntax.py
 25f0fb28e9defcab48a2e946fbb7550a plugins/dbms/db2/takeover.py
@@ -144,7 +146,7 @@ b7d693a6f5f39fee0a65f2d7b0830c5e plugins/dbms/hsqldb/fingerprint.py
 fd369161778d6b48d7f1f7fc14dcdb5c plugins/dbms/hsqldb/__init__.py
 4673ebfdce9859718c19e8a7765da8d3 plugins/dbms/hsqldb/syntax.py
 7c0535736215ca612756cf589adb249b plugins/dbms/hsqldb/takeover.py
-9ceb9430031a26ecebe13ea49cb2a5fa plugins/dbms/informix/connector.py
+97dac442190bd4ffac3ba292e2abfd4c plugins/dbms/informix/connector.py
 c54d70e4847c6327bd3110c4d8723b04 plugins/dbms/informix/enumeration.py
 da9dccd1f9ec2cf1e53295125dd983a0 plugins/dbms/informix/filesystem.py
 35eac2f3837a72940eb50753dc4566e5 plugins/dbms/informix/fingerprint.py
@@ -159,35 +161,35 @@ ffd26f64142226d0b1ed1d70f7f294c0 plugins/dbms/maxdb/filesystem.py
 4321d7018f5121343460ebfd83bb69be plugins/dbms/maxdb/__init__.py
 e7d44671ae26c0bcd5fe8448be070bbd plugins/dbms/maxdb/syntax.py
 bf7842bb291e2297c3c8d1023eb3e550 plugins/dbms/maxdb/takeover.py
-9e64e67291a4c369bad8b8cf2cfa722a plugins/dbms/mssqlserver/connector.py
+decc645344bb93aca504a71ba2e4cad4 plugins/dbms/mssqlserver/connector.py
 f1f1541a54faf67440179fa521f99849 plugins/dbms/mssqlserver/enumeration.py
 177e1d55d28ed3190bc0079b8126c6be plugins/dbms/mssqlserver/filesystem.py
-51eb413ac62408965be20a812f2412c8 plugins/dbms/mssqlserver/fingerprint.py
-affef90b1442285da7e89e46603c502e plugins/dbms/mssqlserver/__init__.py
+08914da79141713bd69a25c3cc7f06a8 plugins/dbms/mssqlserver/fingerprint.py
+f25c50a95e5390ecd32be5a011637349 plugins/dbms/mssqlserver/__init__.py
 612be1929108e7b4512a49a4a3837bbc plugins/dbms/mssqlserver/syntax.py
-08fe8ac7acdfc0e3168b5b069a7c73bf plugins/dbms/mssqlserver/takeover.py
+3c0845fa526e1bb7bbe636fcfcbcc4a6 plugins/dbms/mssqlserver/takeover.py
 f6e1f3f09f32b9cb2ca11c016d373423 plugins/dbms/mysql/connector.py
 445164daf59b890aeacc968af58fcb53 plugins/dbms/mysql/enumeration.py
 4578fa29f04d0a75499f9668466ded07 plugins/dbms/mysql/filesystem.py
-fcbf7ff279c527b4aca0dac94c28d20c plugins/dbms/mysql/fingerprint.py
+4e23494d0a8f41c22ec3861fb404e9f7 plugins/dbms/mysql/fingerprint.py
 30065993f8300994e4658634121609e9 plugins/dbms/mysql/__init__.py
 0e2adbee217f5b94dcc124d24b8dde99 plugins/dbms/mysql/syntax.py
 403591e638b6bfdb840d52bd3138ee56 plugins/dbms/mysql/takeover.py
-999cb8d0d52820d30bdd4b3d658a765d plugins/dbms/oracle/connector.py
+f772070dba85976a7894dac5046b93ea plugins/dbms/oracle/connector.py
 e1ffee36fd18f33f34bb4bac4ae43f14 plugins/dbms/oracle/enumeration.py
 c326b0d8bed92be67888b0242f565ac8 plugins/dbms/oracle/filesystem.py
-e16cbf8abda91a906ca7bafb81d8866e plugins/dbms/oracle/fingerprint.py
+538395c0e5ccb1b6befc17f129f45f29 plugins/dbms/oracle/fingerprint.py
 9cbce3d3747c67f18e65f9c1eb910b0e plugins/dbms/oracle/__init__.py
 5c2f1611c3ceface38a7e95650391ae6 plugins/dbms/oracle/syntax.py
 bcdbd9c04d7d5a911e0e31abe1a24f0f plugins/dbms/oracle/takeover.py
 f99c23db4ee6a6b8c0edbf684d360ad3 plugins/dbms/postgresql/connector.py
 7cdb821884e5f15084d1bea7f8a50574 plugins/dbms/postgresql/enumeration.py
 c8bb829d45752b98e6a03817b92e0fe5 plugins/dbms/postgresql/filesystem.py
-603d533d924498378eccba4f0f196be6 plugins/dbms/postgresql/fingerprint.py
+29560cf78211888802c6e5c8681e7d71 plugins/dbms/postgresql/fingerprint.py
 470860d3e85d11a67f2220bffaa415e7 plugins/dbms/postgresql/__init__.py
 20e6f48f496348be45f3402ebc265dbb plugins/dbms/postgresql/syntax.py
 1287acf330da86a93c8e64aff46e3b65 plugins/dbms/postgresql/takeover.py
-3009438ba259ca159c5ce9799f27dec1 plugins/dbms/sqlite/connector.py
+80a2083a4fb7809d310c3d5ecc94e3c5 plugins/dbms/sqlite/connector.py
 5194556e6b1575b1349f8ccfd773952b plugins/dbms/sqlite/enumeration.py
 90fa97b84998a01dba7cc8c3329a1223 plugins/dbms/sqlite/filesystem.py
 ed52c198f3346ceabdef676e9f5d3c0f plugins/dbms/sqlite/fingerprint.py
@@ -203,7 +205,7 @@ a3db8618eed5bb2807b6f77605cba9cc plugins/dbms/sybase/__init__.py
 79f6c7017db4ded8f74a0117188836ff plugins/dbms/sybase/takeover.py
 34d181a7086d6dfc7e72ae5f8a4cfe0f plugins/generic/connector.py
 e6cd1c5a5244d83396b401f7db43d323 plugins/generic/custom.py
-79c6dbcb7e6ad5e993a44aa52fdc36ed plugins/generic/databases.py
+156c227dbe765da3d0fd2976fbe18d8b plugins/generic/databases.py
 4e2b366bb9cfdaaed719b219913357c6 plugins/generic/entries.py
 d82f2c78c1d4d7c6487e94fd3a68a908 plugins/generic/enumeration.py
 0c8abe66a78edca0660bfb8049d109e2 plugins/generic/filesystem.py
@@ -224,12 +226,13 @@ ec2ba8c757ac96425dcd2b97970edd3a shell/stagers/stager.asp_
 0c48ddb1feb7e38a951ef05a0d48e032 shell/stagers/stager.jsp_
 2f9e459a4cf6a58680978cdce5ff7971 shell/stagers/stager.php_
 4eaeef94314956e4517e5310a28d579a sqlmapapi.py
-5d1d27e7237584c4499ee9a3e698e384 sqlmap.py
+b6e9d67cafb85ff2c3fde165fc577a8d sqlmap.py
+1a1e3a78ded58b240c9dbb1b17996acf tamper/0x2char.py
 4c3b8a7daa4bff52e01d4168be0eedbe tamper/apostrophemask.py
 4115a55b8aba464723d645b7d3156b6e tamper/apostrophenullencode.py
 d7e9a979eff4d7315d804a181e66fc93 tamper/appendnullbyte.py
 0298d81e9dfac7ff18a5236c0f1d84b6 tamper/base64encode.py
-4d44f868c6c97ced29e306347ce5d650 tamper/between.py
+9a3da4aa7b220448aa3ecbb92f68330f tamper/between.py
 e1d2329adc6ca89828a2eaec2951806c tamper/bluecoat.py
 e3cdf13caedb4682bee3ff8fac103606 tamper/chardoubleencode.py
 3b2f68476fbcf8223199e8dd4ec14b64 tamper/charencode.py
@@ -237,7 +240,7 @@ b502023ac6c48e49e652ba524b8e18cc tamper/charunicodeencode.py
 2c2b38974dc773568de7e7d771d7042c tamper/charunicodeescape.py
 6a395de07b60f47d9474ace0a98c160f tamper/commalesslimit.py
 211bb8fa36a6ecb42b719c951c362851 tamper/commalessmid.py
-19acfde79c9a2d8458e15182f5b73d71 tamper/commentbeforeparentheses.py
+6082358eb328d1cdd4587e73c95bbefc tamper/commentbeforeparentheses.py
 334e4a2485b3a1bbc1734823b93ea694 tamper/concat2concatws.py
 dcdc433fe946f1b9005bcd427a951dd6 tamper/equaltolike.py
 06df880df5d8749963f5562f60fd1637 tamper/escapequotes.py
@@ -246,13 +249,13 @@ dcdc433fe946f1b9005bcd427a951dd6 tamper/equaltolike.py
|
|||||||
9d8c350cbb90d4b21ec9c9db184a213a tamper/htmlencode.py
|
9d8c350cbb90d4b21ec9c9db184a213a tamper/htmlencode.py
|
||||||
3f79551baf811ff70b2ba8795a2064be tamper/ifnull2casewhenisnull.py
|
3f79551baf811ff70b2ba8795a2064be tamper/ifnull2casewhenisnull.py
|
||||||
e2c2b6a67546b36983a72f129a817ec0 tamper/ifnull2ifisnull.py
|
e2c2b6a67546b36983a72f129a817ec0 tamper/ifnull2ifisnull.py
|
||||||
91c92ee203e7e619cb547643883924ca tamper/informationschemacomment.py
|
21665e68ef9f91b2395e81d2f341412d tamper/informationschemacomment.py
|
||||||
1e5532ede194ac9c083891c2f02bca93 tamper/__init__.py
|
1e5532ede194ac9c083891c2f02bca93 tamper/__init__.py
|
||||||
2dc49bcd6c55f4e2322b07fa92685356 tamper/least.py
|
2dc49bcd6c55f4e2322b07fa92685356 tamper/least.py
|
||||||
1834b5409c449d2ea1b70a5038fed9eb tamper/lowercase.py
|
1834b5409c449d2ea1b70a5038fed9eb tamper/lowercase.py
|
||||||
de4c83d33968a0cbf00cdfd8d35deddc tamper/modsecurityversioned.py
|
de4c83d33968a0cbf00cdfd8d35deddc tamper/modsecurityversioned.py
|
||||||
39981d5d6cb84aca950458739102bb07 tamper/modsecurityzeroversioned.py
|
39981d5d6cb84aca950458739102bb07 tamper/modsecurityzeroversioned.py
|
||||||
b4cadf2ddcdc0598c9a3bf24521a2fa1 tamper/multiplespaces.py
|
5ee5147612ebe4769a67a8e2305d62f7 tamper/multiplespaces.py
|
||||||
be757e4c9a6fb36af7b9a8c444fddb05 tamper/nonrecursivereplacement.py
|
be757e4c9a6fb36af7b9a8c444fddb05 tamper/nonrecursivereplacement.py
|
||||||
e298e486c06bb39d81f10d61a5c4ceec tamper/overlongutf8more.py
|
e298e486c06bb39d81f10d61a5c4ceec tamper/overlongutf8more.py
|
||||||
b9f698556f8333d9fa6eadaab44a77ab tamper/overlongutf8.py
|
b9f698556f8333d9fa6eadaab44a77ab tamper/overlongutf8.py
|
||||||
@@ -355,7 +358,7 @@ ff80a22ee858f5331b0c088efa98b3ff thirdparty/prettyprint/prettyprint.py
|
|||||||
5c70f8e5f7353aedc6d8d21d4fb72b37 thirdparty/pydes/__init__.py
|
5c70f8e5f7353aedc6d8d21d4fb72b37 thirdparty/pydes/__init__.py
|
||||||
a7f735641c5b695f3d6220fe7c91b030 thirdparty/pydes/pyDes.py
|
a7f735641c5b695f3d6220fe7c91b030 thirdparty/pydes/pyDes.py
|
||||||
d41d8cd98f00b204e9800998ecf8427e thirdparty/socks/__init__.py
|
d41d8cd98f00b204e9800998ecf8427e thirdparty/socks/__init__.py
|
||||||
74fcae36f5a2cc440c1717ae8e3f64c4 thirdparty/socks/socks.py
|
afd97f26bffa0532ee4eb4f5f8ec1ab7 thirdparty/socks/socks.py
|
||||||
d41d8cd98f00b204e9800998ecf8427e thirdparty/termcolor/__init__.py
|
d41d8cd98f00b204e9800998ecf8427e thirdparty/termcolor/__init__.py
|
||||||
ea649aae139d8551af513769dd913dbf thirdparty/termcolor/termcolor.py
|
ea649aae139d8551af513769dd913dbf thirdparty/termcolor/termcolor.py
|
||||||
bf55909ad163b58236e44b86e8441b26 thirdparty/wininetpton/__init__.py
|
bf55909ad163b58236e44b86e8441b26 thirdparty/wininetpton/__init__.py
|
||||||
@@ -388,7 +391,7 @@ ca3ab78d6ed53b7f2c07ed2530d47efd udf/postgresql/windows/32/8.4/lib_postgresqlud
|
|||||||
0d3fe0293573a4453463a0fa5a081de1 udf/postgresql/windows/32/9.0/lib_postgresqludf_sys.dll_
|
0d3fe0293573a4453463a0fa5a081de1 udf/postgresql/windows/32/9.0/lib_postgresqludf_sys.dll_
|
||||||
336d0b0d2be333f5a6184042c85464fd waf/360.py
|
336d0b0d2be333f5a6184042c85464fd waf/360.py
|
||||||
667cacdcd4ba650c9a436f081a79cd64 waf/airlock.py
|
667cacdcd4ba650c9a436f081a79cd64 waf/airlock.py
|
||||||
003cc986b2f5899fe3c85b6309c4b556 waf/anquanbao.py
|
7da7970b45512b0233450dbd8088fde0 waf/anquanbao.py
|
||||||
b61329e8f8bdbf5625f9520ec010af1f waf/armor.py
|
b61329e8f8bdbf5625f9520ec010af1f waf/armor.py
|
||||||
dec64f18c23962d279cc1cde6469afed waf/asm.py
|
dec64f18c23962d279cc1cde6469afed waf/asm.py
|
||||||
6ea7b4ff5f111acb0b24186ef82c3f2d waf/aws.py
|
6ea7b4ff5f111acb0b24186ef82c3f2d waf/aws.py
|
||||||
@@ -398,6 +401,7 @@ ef722d062564def381b1f96f5faadee3 waf/baidu.py
|
|||||||
6a2834daf767491d3331bd31e946d540 waf/binarysec.py
|
6a2834daf767491d3331bd31e946d540 waf/binarysec.py
|
||||||
41e399dbfe7b904d5aacfb37d85e1fbf waf/blockdos.py
|
41e399dbfe7b904d5aacfb37d85e1fbf waf/blockdos.py
|
||||||
2f3bbf43be94d4e9ffe9f80e8483d62f waf/ciscoacexml.py
|
2f3bbf43be94d4e9ffe9f80e8483d62f waf/ciscoacexml.py
|
||||||
|
ba84f296cb52f5e78a0670b98d7763fa waf/cloudbric.py
|
||||||
21b8203fdaaaac3cb7c84fa4dc0627f6 waf/cloudflare.py
|
21b8203fdaaaac3cb7c84fa4dc0627f6 waf/cloudflare.py
|
||||||
b16b1c15532103346d5e2f5b8bd1ed36 waf/cloudfront.py
|
b16b1c15532103346d5e2f5b8bd1ed36 waf/cloudfront.py
|
||||||
ac96f34c254951d301973617064eb1b5 waf/comodo.py
|
ac96f34c254951d301973617064eb1b5 waf/comodo.py
|
||||||
@@ -405,7 +409,7 @@ ac96f34c254951d301973617064eb1b5 waf/comodo.py
|
|||||||
1538b661e35843074f4599be93b3fae9 waf/denyall.py
|
1538b661e35843074f4599be93b3fae9 waf/denyall.py
|
||||||
aade02eb8f6a4a214a53db0fd0f2aae6 waf/dosarrest.py
|
aade02eb8f6a4a214a53db0fd0f2aae6 waf/dosarrest.py
|
||||||
357cbc0a17a44e4f64062b799c718e0b waf/dotdefender.py
|
357cbc0a17a44e4f64062b799c718e0b waf/dotdefender.py
|
||||||
ad20145a12cff50d49085ed06c8e422b waf/edgecast.py
|
7ec3f2a90914b501100685aa66aadf02 waf/edgecast.py
|
||||||
954bebd4a246d8b88794de00ccaecd3b waf/expressionengine.py
|
954bebd4a246d8b88794de00ccaecd3b waf/expressionengine.py
|
||||||
a2ce6cde682f78e1fd561dc40611877e waf/fortiweb.py
|
a2ce6cde682f78e1fd561dc40611877e waf/fortiweb.py
|
||||||
eb56ac34775cc3c5f721ec967d04b283 waf/generic.py
|
eb56ac34775cc3c5f721ec967d04b283 waf/generic.py
|
||||||
@@ -416,7 +420,7 @@ eb56ac34775cc3c5f721ec967d04b283 waf/generic.py
|
|||||||
5a5c9452b9779bf39c208ebe26c98fdb waf/jiasule.py
|
5a5c9452b9779bf39c208ebe26c98fdb waf/jiasule.py
|
||||||
898f53c12133da3e946301f4aa97d538 waf/knownsec.py
|
898f53c12133da3e946301f4aa97d538 waf/knownsec.py
|
||||||
81e6bf619c7bb73c4b62e2439e60e95a waf/kona.py
|
81e6bf619c7bb73c4b62e2439e60e95a waf/kona.py
|
||||||
4906ab7bea7f6715f5796933f1a89381 waf/modsecurity.py
|
b17a154fe7959619eaafffa60e14199f waf/modsecurity.py
|
||||||
d09a50713daf3c0a2594ed4f50c57adb waf/naxsi.py
|
d09a50713daf3c0a2594ed4f50c57adb waf/naxsi.py
|
||||||
bf573d01d56e585f4ad57132bc594934 waf/netcontinuum.py
|
bf573d01d56e585f4ad57132bc594934 waf/netcontinuum.py
|
||||||
cb2f1516867684042f580e02138463de waf/netscaler.py
|
cb2f1516867684042f580e02138463de waf/netscaler.py
|
||||||
@@ -441,7 +445,7 @@ dffa9cebad777308714aaf83b71635b4 waf/teros.py
|
|||||||
b37210459a13de40bf07722c4d032c33 waf/trafficshield.py
|
b37210459a13de40bf07722c4d032c33 waf/trafficshield.py
|
||||||
fe01932df9acea7f6d23f03c6b698646 waf/urlscan.py
|
fe01932df9acea7f6d23f03c6b698646 waf/urlscan.py
|
||||||
a687449cd4e45f69e33b13d41e021480 waf/uspses.py
|
a687449cd4e45f69e33b13d41e021480 waf/uspses.py
|
||||||
814fcc4ab087fb181ddad5fc12bd3d53 waf/varnish.py
|
f3a81da13ee098e94edd965ea4b37b04 waf/varnish.py
|
||||||
20840afc269920826deac2b6c00d6b9c waf/wallarm.py
|
20840afc269920826deac2b6c00d6b9c waf/wallarm.py
|
||||||
11205abf397ae9072adc3234b656ade9 waf/watchguard.py
|
11205abf397ae9072adc3234b656ade9 waf/watchguard.py
|
||||||
9bf34539f382987490d2239d8ef0a651 waf/webappsecure.py
|
9bf34539f382987490d2239d8ef0a651 waf/webappsecure.py
|
||||||
@@ -464,10 +468,10 @@ d989813ee377252bca2103cea524c06b xml/banner/sharepoint.xml
|
|||||||
fb93505ef0ab3b4a20900f3e5625260d xml/boundaries.xml
|
fb93505ef0ab3b4a20900f3e5625260d xml/boundaries.xml
|
||||||
0d0d4bd0e06c99dd8eb4f92acc25caf3 xml/errors.xml
|
0d0d4bd0e06c99dd8eb4f92acc25caf3 xml/errors.xml
|
||||||
a279656ea3fcb85c727249b02f828383 xml/livetests.xml
|
a279656ea3fcb85c727249b02f828383 xml/livetests.xml
|
||||||
14a2abeb88b00ab489359d0dd7a3017f xml/payloads/boolean_blind.xml
|
3318571fac8df058f19ea85780606643 xml/payloads/boolean_blind.xml
|
||||||
b5b8b0aebce810e6cdda1b7106c96427 xml/payloads/error_based.xml
|
b5b8b0aebce810e6cdda1b7106c96427 xml/payloads/error_based.xml
|
||||||
06b1a210b190d52477a9d492443725b5 xml/payloads/inline_query.xml
|
06b1a210b190d52477a9d492443725b5 xml/payloads/inline_query.xml
|
||||||
3194e2688a7576e1f877d5b137f7c260 xml/payloads/stacked_queries.xml
|
3194e2688a7576e1f877d5b137f7c260 xml/payloads/stacked_queries.xml
|
||||||
c2d8dd03db5a663e79eabb4495dd0723 xml/payloads/time_blind.xml
|
c2d8dd03db5a663e79eabb4495dd0723 xml/payloads/time_blind.xml
|
||||||
ac649aff0e7db413e4937e446e398736 xml/payloads/union_query.xml
|
ac649aff0e7db413e4937e446e398736 xml/payloads/union_query.xml
|
||||||
186808373a45316a45ad5f6ca8d90ff3 xml/queries.xml
|
a5eecbca03800851635817e0ca832a92 xml/queries.xml
|
||||||
|
|||||||
waf/anquanbao.py
@@ -17,7 +17,7 @@ def detect(get_page):
     for vector in WAF_ATTACK_VECTORS:
         page, headers, code = get_page(get=vector)
         retval = re.search(r"MISS", headers.get("X-Powered-By-Anquanbao", ""), re.I) is not None
-        retval |= code == 405 and "/aqb_cc/error/" in (page or "")
+        retval |= code == 405 and any(_ in (page or "") for _ in ("/aqb_cc/error/", "hidden_intercept_time"))
         if retval:
             break
waf/cloudbric.py (new file, 19 lines)
@@ -0,0 +1,19 @@
+#!/usr/bin/env python
+
+"""
+Copyright (c) 2006-2018 sqlmap developers (http://sqlmap.org/)
+See the file 'LICENSE' for copying permission
+"""
+
+from lib.core.settings import WAF_ATTACK_VECTORS
+
+__product__ = "Cloudbric Web Application Firewall (Cloudbric)"
+
+def detect(get_page):
+    retval = False
+
+    for vector in WAF_ATTACK_VECTORS:
+        page, headers, code = get_page(get=vector)
+        retval = code >= 400 and all(_ in (page or "") for _ in ("Cloudbric", "Malicious Code Detected"))
+
+    return retval
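For context on the new detection script above: every `waf/*.py` module in sqlmap exposes a `detect(get_page)` function, where `get_page` is a callable supplied by sqlmap's core that sends a probe request and returns a `(page, headers, code)` tuple. The following is a minimal self-contained sketch of that contract, with `WAF_ATTACK_VECTORS` and the HTTP layer stubbed out (the real vectors live in `lib.core.settings`, and the stub responses here are invented for illustration):

```python
# Stub of lib.core.settings.WAF_ATTACK_VECTORS (real values differ)
WAF_ATTACK_VECTORS = ("search=<script>alert(1)</script>", "id=1 AND 1=1 UNION ALL SELECT 1")

def detect(get_page):
    # Body taken from the new waf/cloudbric.py: the Cloudbric block page
    # carries a 4xx/5xx status and both marker strings
    retval = False

    for vector in WAF_ATTACK_VECTORS:
        page, headers, code = get_page(get=vector)
        retval = code >= 400 and all(_ in (page or "") for _ in ("Cloudbric", "Malicious Code Detected"))

    return retval

def blocked_page(get=None):
    # Simulated Cloudbric block response (hypothetical markup)
    return "<html>Cloudbric: Malicious Code Detected</html>", {}, 403

def clean_page(get=None):
    # Simulated unprotected response
    return "<html>OK</html>", {}, 200

print(detect(blocked_page))  # True
print(detect(clean_page))    # False
```

Note the script has no early `break`: `retval` simply ends up reflecting the last probe, which is enough here because a Cloudbric-protected target answers every attack vector with the same block page.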
waf/edgecast.py
@@ -10,7 +10,7 @@ import re
 from lib.core.enums import HTTP_HEADER
 from lib.core.settings import WAF_ATTACK_VECTORS

-__product__ = "EdgeCast WAF (Verizon)"
+__product__ = "EdgeCast Web Application Firewall (Verizon)"

 def detect(get_page):
     retval = False
waf/modsecurity.py
@@ -18,7 +18,7 @@ def detect(get_page):
     for vector in WAF_ATTACK_VECTORS:
         page, headers, code = get_page(get=vector)
         retval = re.search(r"Mod_Security|NOYB", headers.get(HTTP_HEADER.SERVER, ""), re.I) is not None
-        retval |= "This error was generated by Mod_Security" in (page or "")
+        retval |= any(_ in (page or "") for _ in ("This error was generated by Mod_Security", "One or more things in your request were suspicious", "rules of the mod_security module"))
         if retval:
             break
waf/varnish.py
@@ -21,6 +21,7 @@ def detect(get_page):
         retval |= re.search(r"varnish\Z", headers.get(HTTP_HEADER.VIA, ""), re.I) is not None
         retval |= re.search(r"varnish", headers.get(HTTP_HEADER.SERVER, ""), re.I) is not None
         retval |= code == 404 and re.search(r"\bXID: \d+", page or "") is not None
+        retval |= code >= 400 and "Request rejected by xVarnish-WAF" in (page or "")
         if retval:
             break
xml/payloads/boolean_blind.xml
@@ -1361,6 +1361,23 @@ Tag: <test>
             <dbms>SAP MaxDB</dbms>
         </details>
     </test>
+
+    <!-- Works in MySQL, Oracle, etc. -->
+    <test>
+        <title>HAVING boolean-based blind - WHERE, GROUP BY clause</title>
+        <stype>1</stype>
+        <level>3</level>
+        <risk>1</risk>
+        <clause>1,2</clause>
+        <where>1</where>
+        <vector>HAVING [INFERENCE]</vector>
+        <request>
+            <payload>HAVING [RANDNUM]=[RANDNUM]</payload>
+        </request>
+        <response>
+            <comparison>HAVING [RANDNUM]=[RANDNUM1]</comparison>
+        </response>
+    </test>
     <!-- End of boolean-based blind tests - ORDER BY, GROUP BY clause -->

     <!-- Boolean-based blind tests - Stacked queries -->
xml/queries.xml
@@ -711,7 +711,7 @@
         <inband query="SELECT table_schem FROM INFORMATION_SCHEMA.SYSTEM_SCHEMAS WHERE %s" condition="table_schem"/>
     </search_db>
     <search_table>
-        <blind query="SELECT DISTINCT(table_schem) FROM INFORMATION_SCHEMA.SYSTEM_TABLES WHERE %s" count="SELECT COUNT(DISTINCT(table_schem)) FROM INFORMATION_SCHEMA.SYSTEM_TABLES WHERE %s" condition="table_name" condition2="table_schem"/>
+        <blind query="SELECT DISTINCT(table_schem) FROM INFORMATION_SCHEMA.SYSTEM_TABLES WHERE %s" query2="SELECT DISTINCT(table_name) FROM INFORMATION_SCHEMA.SYSTEM_TABLES WHERE table_schem='%s'" count="SELECT COUNT(DISTINCT(table_schem)) FROM INFORMATION_SCHEMA.SYSTEM_TABLES WHERE %s" count2="SELECT COUNT(DISTINCT(table_name)) FROM INFORMATION_SCHEMA.SYSTEM_TABLES WHERE table_schem='%s'" condition="table_name" condition2="table_schem"/>
         <inband query="SELECT table_schem,table_name FROM INFORMATION_SCHEMA.SYSTEM_TABLES WHERE %s" condition="table_name" condition2="table_schem"/>
     </search_table>
     <search_column>
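The `query2`/`count2` attributes added in this hunk use the same printf-style `%s` placeholder as the rest of `queries.xml`: the first-pass query enumerates matching schemas, and the second-pass template is filled in with each schema name found. A hedged sketch of that expansion (the `expand` helper is hypothetical; sqlmap's real substitution happens in its core, not shown here):

```python
# query2 template from the queries.xml hunk above: second-pass enumeration of
# table names within one schema discovered by the first-pass query
QUERY2 = "SELECT DISTINCT(table_name) FROM INFORMATION_SCHEMA.SYSTEM_TABLES WHERE table_schem='%s'"

def expand(template, schema):
    # Plain %-substitution, as the single %s placeholder in the XML suggests
    return template % schema

# e.g. for the default HSQLDB schema name
print(expand(QUERY2, "PUBLIC"))
```

The `count`/`count2` pair works the same way, letting the blind technique first learn how many rows to retrieve before inferring each value.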